| datasetId | card |
|---|---|
anonymousparrot01/SubmissionData | ---
annotations_creators: []
language:
- en
language_creators: []
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
pretty_name: CompanyWeb
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- business
- company websites
task_categories:
- fill-mask
- other
task_ids:
- masked-language-modeling
---
# Dataset Card for "CompanyWeb"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [PLACEHOLDER]()
- **Repository:** [PLACEHOLDER]()
- **Paper:** [PLACEHOLDER]()
- **Leaderboard:** [PLACEHOLDER]()
- **Point of Contact:** [PLACEHOLDER]()
### Dataset Summary
The dataset contains textual content extracted from 1,788,413 web pages of 393,542 companies. The companies in the dataset range from small and medium-sized to large international enterprises, including publicly listed companies. Each company is additionally labeled with its four-digit Standard Industrial Classification code (`sic4`).
The text comprises all textual information found on each website, covering the period from 2014 to 2021. The crawl includes all pages linked from the homepage whose URLs contain the company domain name.
We filter the resulting text to keep only English content, using the fastText language-identification model [(Joulin et al., 2016)](https://aclanthology.org/E17-2068/).
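The filtering step described above can be sketched as follows. This is a minimal illustration, not the curators' pipeline: `naive_predict` is a hypothetical stand-in for the fastText predictor, and the confidence threshold is an assumption (the card does not state one).

```python
def filter_english(pages, predict_language, threshold=0.5):
    """Keep only pages whose detected language is English.

    `predict_language` returns an (iso_code, confidence) pair; in
    practice it could wrap fastText's lid.176 language-ID model.
    """
    kept = []
    for page in pages:
        lang, conf = predict_language(page["text"])
        if lang == "en" and conf >= threshold:
            kept.append(page)
    return kept

# Hypothetical stub standing in for the fastText predictor:
def naive_predict(text):
    return ("en", 0.9) if "the" in text.lower() else ("de", 0.9)

pages = [
    {"cid": "c1", "text": "Welcome to the company homepage."},
    {"cid": "c2", "text": "Willkommen auf unserer Firmenseite."},
]
english_pages = filter_english(pages, naive_predict)
```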
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
- en
## Dataset Structure
### Data Instances
- **#Instances:** 1,788,413
- **#Companies:** 393,542
- **Timeline:** 2014-2021
### Data Fields
- `id`: instance identifier `(string)`
- `cid`: company identifier `(string)`
- `text`: website text `(string)`
- `sic4`: 4-digit SIC `(string)`
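As an illustration of the schema above, a record could be checked like this; the `validate_record` helper and the sample values are hypothetical, not part of the dataset.

```python
def validate_record(record):
    """Check that a dict matches the CompanyWeb field schema:
    all four fields are strings, and `sic4` is a 4-digit code."""
    for field in ("id", "cid", "text", "sic4"):
        if not isinstance(record.get(field), str):
            return False
    return len(record["sic4"]) == 4 and record["sic4"].isdigit()

# Hypothetical sample record (values invented for illustration):
sample = {
    "id": "0001",
    "cid": "c-42",
    "text": "We manufacture industrial pumps.",
    "sic4": "3561",
}
```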
### Data Splits
[PLACEHOLDER]
## Dataset Creation
### Curation Rationale
[PLACEHOLDER]
### Source Data
#### Initial Data Collection and Normalization
[PLACEHOLDER]
#### Who are the source language producers?
[PLACEHOLDER]
### Annotations
#### Annotation process
[PLACEHOLDER]
#### Who are the annotators?
[PLACEHOLDER]
### Personal and Sensitive Information
[PLACEHOLDER]
## Considerations for Using the Data
### Social Impact of Dataset
[PLACEHOLDER]
### Discussion of Biases
[PLACEHOLDER]
### Other Known Limitations
[PLACEHOLDER]
## Additional Information
### Dataset Curators
[PLACEHOLDER]
### Licensing Information
[PLACEHOLDER]
### Citation Information
```bibtex
@misc{title_year,
title={TITLE},
author={AUTHORS},
year={YEAR},
}
```
### Contributions
[PLACEHOLDER]
<!-- Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. --> |
divi7007/openassistant | ---
license: other
---
|
WeBots/WeDataset | ---
license: mit
---
|
coggpt/ParaPat | ---
license: mit
---
This repository contains a parallel corpus built from the open-access Google Patents dataset, covering 74 language pairs and comprising more than 68 million sentences and 800 million tokens. Sentences were automatically aligned with the Hunalign algorithm for the 22 largest language pairs, while the remaining pairs were aligned at the abstract (i.e. paragraph) level. We demonstrate the capabilities of the corpus by training Neural Machine Translation (NMT) models for the 9 main language pairs, 18 models in total. The corpus is freely available in TSV format.
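A TSV release like this can be streamed with Python's `csv` module. The sketch below assumes a source/target two-column layout, which should be verified against the actual released files.

```python
import csv
import io

def read_parallel_tsv(fileobj):
    """Yield (source, target) sentence pairs from a two-column TSV stream."""
    reader = csv.reader(fileobj, delimiter="\t")
    for row in reader:
        if len(row) >= 2:
            yield row[0], row[1]

# Hypothetical in-memory sample standing in for a corpus file:
sample = io.StringIO("A control method.\tUn proc\u00e9d\u00e9 de commande.\n")
pairs = list(read_parallel_tsv(sample))
```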
https://figshare.com/articles/dataset/ParaPat_The_Multi-Million_Sentences_Parallel_Corpus_of_Patents_Abstracts/12627632 |
fathyshalab/reklamation24_supermaerkte-drogerien-full | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: mini-lm-sentence-transformers
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 84943744
num_examples: 15366
download_size: 0
dataset_size: 84943744
---
# Dataset Card for "reklamation24_supermaerkte-drogerien-full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lava_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lava/ラヴァ/炎熔 (Arknights)
This is the dataset of lava/ラヴァ/炎熔 (Arknights), containing 66 images and their tags.
The core tags of this character are `purple_hair, horns, purple_eyes, pointy_ears, demon_horns, twintails, braid`; these core tags are pruned from the per-image tag lists in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 66 | 107.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lava_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 66 | 91.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lava_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 168 | 187.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lava_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/lava_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
The following tag clusters were computed automatically; some outfits may be discoverable from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, long_hair, solo, black_dress, breasts, official_alternate_costume, looking_at_viewer, sleeveless_dress, bare_shoulders, bracelet, earrings, holding_fan, nail_polish, sitting, shawl, thigh_strap, closed_mouth, hair_ornament, purple_nails, black_necktie, feet_out_of_frame, parted_lips, white_background |
| 1 | 13 |  |  |  |  |  | 1girl, solo, black_jacket, looking_at_viewer, official_alternate_costume, black_necktie, closed_mouth, shirt, simple_background, black_gloves, white_background, open_jacket, short_sleeves, holding, hooded_jacket |
| 2 | 25 |  |  |  |  |  | 1girl, solo, black_shirt, necklace, bare_shoulders, collarbone, looking_at_viewer, off_shoulder, short_twintails, navel, purple_skirt, short_hair, short_sleeves, midriff, jacket, pantyhose, simple_background, white_background, book, ear_piercing, holding_knife, ring, stomach |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_hair | solo | black_dress | breasts | official_alternate_costume | looking_at_viewer | sleeveless_dress | bare_shoulders | bracelet | earrings | holding_fan | nail_polish | sitting | shawl | thigh_strap | closed_mouth | hair_ornament | purple_nails | black_necktie | feet_out_of_frame | parted_lips | white_background | black_jacket | shirt | simple_background | black_gloves | open_jacket | short_sleeves | holding | hooded_jacket | black_shirt | necklace | collarbone | off_shoulder | short_twintails | navel | purple_skirt | short_hair | midriff | jacket | pantyhose | book | ear_piercing | holding_knife | ring | stomach |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:--------------|:----------|:-----------------------------|:--------------------|:-------------------|:-----------------|:-----------|:-----------|:--------------|:--------------|:----------|:--------|:--------------|:---------------|:----------------|:---------------|:----------------|:--------------------|:--------------|:-------------------|:---------------|:--------|:--------------------|:---------------|:--------------|:----------------|:----------|:----------------|:--------------|:-----------|:-------------|:---------------|:------------------|:--------|:---------------|:-------------|:----------|:---------|:------------|:-------|:---------------|:----------------|:-------|:----------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | | | X | X | | | | | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 25 |  |  |  |  |  | X | | X | | | | X | | X | | | | | | | | | | | | | | X | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
renhj/test2 | ---
license: apache-2.0
---
|
AWeirdDev/zh-tw-recipes-sm | ---
dataset_info:
features:
- name: image
dtype: string
- name: title
dtype: string
- name: descriotion
dtype: string
- name: cooking_time
dtype: string
- name: author
dtype: string
- name: url
dtype: string
- name: servings
dtype: int64
- name: ingredients
list:
- name: name
dtype: string
- name: unit
dtype: string
- name: steps
dtype: string
splits:
- name: train
num_bytes: 2205483
num_examples: 1799
download_size: 1079443
dataset_size: 2205483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
tags:
- recipe
license: mit
language:
- zh
size_categories:
- 1K<n<10K
--- |
musicakamusic/Afro | ---
license: gpl-3.0
---
|
mandarjoshi/trivia_qa | ---
annotations_creators:
- crowdsourced
language_creators:
- machine-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
- text2text-generation
task_ids:
- open-domain-qa
- open-domain-abstractive-qa
- extractive-qa
- abstractive-qa
paperswithcode_id: triviaqa
pretty_name: TriviaQA
dataset_info:
- config_name: rc
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 12749651131
num_examples: 138384
- name: validation
num_bytes: 1662321188
num_examples: 17944
- name: test
num_bytes: 1577710503
num_examples: 17210
download_size: 8998808983
dataset_size: 15989682822
- config_name: rc.nocontext
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 106882730
num_examples: 138384
- name: validation
num_bytes: 14059830
num_examples: 17944
- name: test
num_bytes: 3667903
num_examples: 17210
download_size: 63926518
dataset_size: 124610463
- config_name: rc.web
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 9408851139
num_examples: 76496
- name: validation
num_bytes: 1232155138
num_examples: 9951
- name: test
num_bytes: 1171663999
num_examples: 9509
download_size: 6626625832
dataset_size: 11812670276
- config_name: rc.web.nocontext
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 58523085
num_examples: 76496
- name: validation
num_bytes: 7694557
num_examples: 9951
- name: test
num_bytes: 2024747
num_examples: 9509
download_size: 35123473
dataset_size: 68242389
- config_name: rc.wikipedia
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 3340799992
num_examples: 61888
- name: validation
num_bytes: 430166050
num_examples: 7993
- name: test
num_bytes: 406046504
num_examples: 7701
download_size: 2293374081
dataset_size: 4177012546
- config_name: rc.wikipedia.nocontext
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 48359645
num_examples: 61888
- name: validation
num_bytes: 6365273
num_examples: 7993
- name: test
num_bytes: 1643156
num_examples: 7701
download_size: 28803950
dataset_size: 56368074
- config_name: unfiltered
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 23292199425
num_examples: 87622
- name: validation
num_bytes: 3038803743
num_examples: 11313
- name: test
num_bytes: 2906455311
num_examples: 10832
download_size: 16695552268
dataset_size: 29237458479
- config_name: unfiltered.nocontext
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 63300226
num_examples: 87622
- name: validation
num_bytes: 8296870
num_examples: 11313
- name: test
num_bytes: 2320660
num_examples: 10832
download_size: 38364033
dataset_size: 73917756
- config_name: unfiltered.web
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
- name: validation
- name: test
download_size: 3298328560
dataset_size: 0
- config_name: unfiltered.web.nocontext
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
- name: validation
- name: test
download_size: 632549060
dataset_size: 0
- config_name: unfiltered.wikipedia
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
- name: validation
- name: test
download_size: 3298328560
dataset_size: 0
- config_name: unfiltered.wikipedia.nocontext
features:
- name: question
dtype: string
- name: question_id
dtype: string
- name: question_source
dtype: string
- name: entity_pages
sequence:
- name: doc_source
dtype: string
- name: filename
dtype: string
- name: title
dtype: string
- name: wiki_context
dtype: string
- name: search_results
sequence:
- name: description
dtype: string
- name: filename
dtype: string
- name: rank
dtype: int32
- name: title
dtype: string
- name: url
dtype: string
- name: search_context
dtype: string
- name: answer
struct:
- name: aliases
sequence: string
- name: normalized_aliases
sequence: string
- name: matched_wiki_entity_name
dtype: string
- name: normalized_matched_wiki_entity_name
dtype: string
- name: normalized_value
dtype: string
- name: type
dtype: string
- name: value
dtype: string
splits:
- name: train
- name: validation
- name: test
download_size: 632549060
dataset_size: 0
configs:
- config_name: rc
data_files:
- split: train
path: rc/train-*
- split: validation
path: rc/validation-*
- split: test
path: rc/test-*
- config_name: rc.nocontext
data_files:
- split: train
path: rc.nocontext/train-*
- split: validation
path: rc.nocontext/validation-*
- split: test
path: rc.nocontext/test-*
- config_name: rc.web
data_files:
- split: train
path: rc.web/train-*
- split: validation
path: rc.web/validation-*
- split: test
path: rc.web/test-*
- config_name: rc.web.nocontext
data_files:
- split: train
path: rc.web.nocontext/train-*
- split: validation
path: rc.web.nocontext/validation-*
- split: test
path: rc.web.nocontext/test-*
- config_name: rc.wikipedia
data_files:
- split: train
path: rc.wikipedia/train-*
- split: validation
path: rc.wikipedia/validation-*
- split: test
path: rc.wikipedia/test-*
- config_name: rc.wikipedia.nocontext
data_files:
- split: train
path: rc.wikipedia.nocontext/train-*
- split: validation
path: rc.wikipedia.nocontext/validation-*
- split: test
path: rc.wikipedia.nocontext/test-*
- config_name: unfiltered
data_files:
- split: train
path: unfiltered/train-*
- split: validation
path: unfiltered/validation-*
- split: test
path: unfiltered/test-*
- config_name: unfiltered.nocontext
data_files:
- split: train
path: unfiltered.nocontext/train-*
- split: validation
path: unfiltered.nocontext/validation-*
- split: test
path: unfiltered.nocontext/test-*
---
# Dataset Card for "trivia_qa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://nlp.cs.washington.edu/triviaqa/](http://nlp.cs.washington.edu/triviaqa/)
- **Repository:** [https://github.com/mandarjoshi90/triviaqa](https://github.com/mandarjoshi90/triviaqa)
- **Paper:** [TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension](https://arxiv.org/abs/1705.03551)
- **Leaderboard:** [CodaLab Leaderboard](https://competitions.codalab.org/competitions/17208#results)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 9.26 GB
- **Size of the generated dataset:** 45.46 GB
- **Total amount of disk used:** 54.72 GB
### Dataset Summary
TriviaQA is a reading comprehension dataset containing over 650K
question-answer-evidence triples. TriviaQA includes 95K question-answer
pairs authored by trivia enthusiasts and independently gathered evidence
documents, six per question on average, that provide high-quality distant
supervision for answering the questions.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
English.
## Dataset Structure
### Data Instances
#### rc
- **Size of downloaded dataset files:** 2.67 GB
- **Size of the generated dataset:** 16.02 GB
- **Total amount of disk used:** 18.68 GB
An example of 'train' looks as follows.
```
```
#### rc.nocontext
- **Size of downloaded dataset files:** 2.67 GB
- **Size of the generated dataset:** 126.27 MB
- **Total amount of disk used:** 2.79 GB
An example of 'train' looks as follows.
```
```
#### unfiltered
- **Size of downloaded dataset files:** 3.30 GB
- **Size of the generated dataset:** 29.24 GB
- **Total amount of disk used:** 32.54 GB
An example of 'validation' looks as follows.
```
```
#### unfiltered.nocontext
- **Size of downloaded dataset files:** 632.55 MB
- **Size of the generated dataset:** 74.56 MB
- **Total amount of disk used:** 707.11 MB
An example of 'train' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### rc
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
- `doc_source`: a `string` feature.
- `filename`: a `string` feature.
- `title`: a `string` feature.
- `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
- `description`: a `string` feature.
- `filename`: a `string` feature.
- `rank`: an `int32` feature.
- `title`: a `string` feature.
- `url`: a `string` feature.
- `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
#### rc.nocontext
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
- `doc_source`: a `string` feature.
- `filename`: a `string` feature.
- `title`: a `string` feature.
- `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
- `description`: a `string` feature.
- `filename`: a `string` feature.
- `rank`: an `int32` feature.
- `title`: a `string` feature.
- `url`: a `string` feature.
- `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
#### unfiltered
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
- `doc_source`: a `string` feature.
- `filename`: a `string` feature.
- `title`: a `string` feature.
- `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
- `description`: a `string` feature.
- `filename`: a `string` feature.
- `rank`: an `int32` feature.
- `title`: a `string` feature.
- `url`: a `string` feature.
- `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
#### unfiltered.nocontext
- `question`: a `string` feature.
- `question_id`: a `string` feature.
- `question_source`: a `string` feature.
- `entity_pages`: a dictionary feature containing:
- `doc_source`: a `string` feature.
- `filename`: a `string` feature.
- `title`: a `string` feature.
- `wiki_context`: a `string` feature.
- `search_results`: a dictionary feature containing:
- `description`: a `string` feature.
- `filename`: a `string` feature.
- `rank`: a `int32` feature.
- `title`: a `string` feature.
- `url`: a `string` feature.
- `search_context`: a `string` feature.
- `aliases`: a `list` of `string` features.
- `normalized_aliases`: a `list` of `string` features.
- `matched_wiki_entity_name`: a `string` feature.
- `normalized_matched_wiki_entity_name`: a `string` feature.
- `normalized_value`: a `string` feature.
- `type`: a `string` feature.
- `value`: a `string` feature.
### Data Splits
| name |train |validation|test |
|--------------------|-----:|---------:|----:|
|rc |138384| 18669|17210|
|rc.nocontext |138384| 18669|17210|
|unfiltered | 87622| 11313|10832|
|unfiltered.nocontext| 87622| 11313|10832|
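As a minimal sketch (not taken from the dataset itself), the answer fields listed above (`value`, `aliases`, and their `normalized_*` counterparts) support alias-based answer matching. The record below is hypothetical, and the lowercase/strip normalization is an assumption suggested by the field names:

```python
def is_correct(prediction: str, record: dict) -> bool:
    """Check a predicted answer against the answer aliases.

    Assumes the `normalized_*` fields hold lowercased, stripped strings,
    as their names suggest.
    """
    normalized = prediction.strip().lower()
    return (normalized in record["normalized_aliases"]
            or normalized == record["normalized_value"])

# Hypothetical answer fragment in the documented shape.
record = {
    "value": "Arizona",
    "normalized_value": "arizona",
    "aliases": ["Arizona", "AZ"],
    "normalized_aliases": ["arizona", "az"],
}

print(is_correct("Arizona", record))  # True
print(is_correct("Nevada", record))   # False
```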
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The University of Washington does not own the copyright of the questions and documents included in TriviaQA.
### Citation Information
```
@article{2017arXivtriviaqa,
  author = {{Joshi}, Mandar and {Choi}, Eunsol and {Weld}, Daniel and {Zettlemoyer}, Luke},
  title = "{TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension}",
  journal = {arXiv e-prints},
  year = 2017,
  eid = {arXiv:1705.03551},
  pages = {arXiv:1705.03551},
  archivePrefix = {arXiv},
  eprint = {1705.03551},
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. |
Gabriel1322/jutalo | ---
license: openrail
---
|
CyberHarem/kuroe_puellamagimadokamagicasidestorymagiarecord | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kuroe
This is the dataset of Kuroe, containing 150 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 150 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 321 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 150 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 150 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 150 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 150 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 150 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 321 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 321 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 321 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
autoevaluate/autoeval-staging-eval-glue-cola-42256f-15426136 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: binary_classification
model: navsad/navid_test_bert
metrics: []
dataset_name: glue
dataset_config: cola
dataset_split: validation
col_mapping:
text: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Binary Text Classification
* Model: navsad/navid_test_bert
* Dataset: glue
* Config: cola
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@yooo](https://huggingface.co/yooo) for evaluating this model. |
khoomeik/satscale-ca-300 | ---
dataset_info:
features:
- name: name
dtype: string
- name: n_vars
dtype: int64
- name: n_clauses
dtype: int64
- name: clauses
sequence:
sequence: int64
- name: marginals
sequence: float64
- name: assignments
sequence: int64
splits:
- name: train
num_bytes: 960486
num_examples: 300
- name: valid
num_bytes: 337034
num_examples: 100
- name: test
num_bytes: 306382
num_examples: 100
download_size: 210450
dataset_size: 1603902
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
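The schema above stores each formula as `clauses` (a sequence of integer sequences) alongside candidate `assignments`. As a minimal sketch, assuming the common DIMACS-style signed-literal encoding (a positive entry `k` means variable `k`, a negative entry its negation; this encoding is an assumption, not stated by the card), an assignment can be checked against a formula like this:

```python
def satisfies(clauses, assignment):
    """Check a CNF formula under a 0/1 assignment.

    Assumes DIMACS-style literals: clause entry k > 0 refers to variable k,
    k < 0 to its negation; assignment[i] is the value of variable i + 1.
    """
    def lit_true(lit):
        value = assignment[abs(lit) - 1]
        return value == 1 if lit > 0 else value == 0

    # The formula is satisfied when every clause has at least one true literal.
    return all(any(lit_true(lit) for lit in clause) for clause in clauses)

# (x1 OR NOT x2) AND (x2 OR x3)
clauses = [[1, -2], [2, 3]]
print(satisfies(clauses, [1, 0, 1]))  # True
print(satisfies(clauses, [0, 1, 0]))  # False
```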
|
q21/embeddings1_autogpt | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_56 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1384794860.0
num_examples: 271955
download_size: 1415484511
dataset_size: 1384794860.0
---
# Dataset Card for "chunk_56"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sz4qwe/1 | ---
license: afl-3.0
---
|
hails/agieval-sat-en-without-passage | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 155279
num_examples: 206
download_size: 85336
dataset_size: 155279
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "agieval-sat-en-without-passage"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo, following dmayhem93/agieval-* datasets on the HF hub.
This dataset contains the contents of the SAT-En-without-passage subtask of AGIEval, as accessed in https://github.com/ruixiangcui/AGIEval/commit/5c77d073fda993f1652eaae3cf5d04cc5fd21d40 .
Citation:
```
@misc{zhong2023agieval,
  title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
  author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
  year={2023},
  eprint={2304.06364},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
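Given the schema above (`query` as a string, `choices` as a sequence of strings, `gold` as a sequence of correct choice indices), accuracy over the test split can be sketched as follows. The records here are hypothetical placeholders in the documented shape, not examples from the dataset:

```python
def accuracy(records, predictions):
    """Fraction of records whose predicted choice index appears in `gold`.

    Each record has "choices" (list of strings) and "gold" (list of
    correct choice indices, matching the schema above).
    """
    correct = sum(1 for rec, pred in zip(records, predictions)
                  if pred in rec["gold"])
    return correct / len(records)

# Hypothetical records in the documented shape.
records = [
    {"query": "…", "choices": ["(A) red", "(B) blue", "(C) green", "(D) gray"], "gold": [1]},
    {"query": "…", "choices": ["(A) yes", "(B) no"], "gold": [0]},
]
print(accuracy(records, [1, 1]))  # 0.5
```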
|
CyberHarem/uzumaki_kushina_naruto | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of uzumaki_kushina (NARUTO)
This is the dataset of uzumaki_kushina (NARUTO), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
KarlGauss/paisa_corpus_conll | ---
license: cc-by-nc-sa-3.0
task_categories:
- token-classification
language:
- it
--- |
joaozaina/minhavoz | ---
license: openrail
---
|
thesven/bengali-ai-train-set-medium | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: valid
num_bytes: 1442031336
num_examples: 1500
- name: train
num_bytes: 28836567160
num_examples: 30000
download_size: 4742613297
dataset_size: 30278598496
---
# Dataset Card for "bengali-ai-train-set-medium"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lillybak/llme2_sft_dataset_rlaif | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 5573
num_examples: 5
download_size: 10626
dataset_size: 5573
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
solarplasma/coframe | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: image_mask
dtype: image
splits:
- name: train
num_bytes: 1753383342.648
num_examples: 1052
download_size: 1792896752
dataset_size: 1753383342.648
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_79_1713090986 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2662765
num_examples: 6532
download_size: 1349882
dataset_size: 2662765
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tmfi/jawiki-20230911 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8129791520
num_examples: 1386531
download_size: 3964405981
dataset_size: 8129791520
---
# Dataset Card for "jawiki-20230911"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1 | ---
pretty_name: Evaluation run of jondurbin/airoboros-7b-gpt4-1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-7b-gpt4-1.1](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T13:09:52.806111](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1/blob/main/results_2023-10-22T13-09-52.806111.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19798657718120805,\n\
\ \"em_stderr\": 0.00408082849939278,\n \"f1\": 0.2537437080536912,\n\
\ \"f1_stderr\": 0.004098830726202191,\n \"acc\": 0.38097222729184826,\n\
\ \"acc_stderr\": 0.008622604334831044\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19798657718120805,\n \"em_stderr\": 0.00408082849939278,\n\
\ \"f1\": 0.2537437080536912,\n \"f1_stderr\": 0.004098830726202191\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0310841546626232,\n \
\ \"acc_stderr\": 0.004780296718393349\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268738\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|arc:challenge|25_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T13_09_52.806111
path:
- '**/details_harness|drop|3_2023-10-22T13-09-52.806111.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T13-09-52.806111.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T13_09_52.806111
path:
- '**/details_harness|gsm8k|5_2023-10-22T13-09-52.806111.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T13-09-52.806111.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hellaswag|10_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T13:46:19.144094.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T13:46:19.144094.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T13_09_52.806111
path:
- '**/details_harness|winogrande|5_2023-10-22T13-09-52.806111.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T13-09-52.806111.parquet'
- config_name: results
data_files:
- split: 2023_07_31T13_46_19.144094
path:
- results_2023-07-31T13:46:19.144094.parquet
- split: 2023_10_22T13_09_52.806111
path:
- results_2023-10-22T13-09-52.806111.parquet
- split: latest
path:
- results_2023-10-22T13-09-52.806111.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.1](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1",
"harness_winogrande_5",
	split="latest")
```
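Each timestamped split name follows the `YYYY_MM_DDTHH_MM_SS.ffffff` pattern seen above. As a minimal sketch (assuming that naming convention holds), the "latest" alias can be resolved by hand by parsing the timestamps — `resolve_latest` here is a hypothetical helper, not part of the `datasets` API:

```python
from datetime import datetime

def resolve_latest(split_names):
    """Pick the most recent timestamped split (hypothetical helper).

    Assumes names like '2023_07_31T13_46_19.144094', as in this card.
    """
    stamped = [s for s in split_names if s != "latest"]
    # Parse the card's timestamp convention back into datetime objects.
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

splits = ["2023_07_31T13_46_19.144094", "2023_10_22T13_09_52.806111", "latest"]
print(resolve_latest(splits))  # the October run is the newer of the two
```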
## Latest results
These are the [latest results from run 2023-10-22T13:09:52.806111](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1/blob/main/results_2023-10-22T13-09-52.806111.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"em": 0.19798657718120805,
"em_stderr": 0.00408082849939278,
"f1": 0.2537437080536912,
"f1_stderr": 0.004098830726202191,
"acc": 0.38097222729184826,
"acc_stderr": 0.008622604334831044
},
"harness|drop|3": {
"em": 0.19798657718120805,
"em_stderr": 0.00408082849939278,
"f1": 0.2537437080536912,
"f1_stderr": 0.004098830726202191
},
"harness|gsm8k|5": {
"acc": 0.0310841546626232,
"acc_stderr": 0.004780296718393349
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268738
}
}
```
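Task keys in the payload above follow a `harness|task|shots` convention, so individual metrics can be pulled out with plain dict traversal; a minimal sketch using two of the entries shown:

```python
# Two per-task entries copied from the results payload above.
results = {
    "harness|gsm8k|5": {"acc": 0.0310841546626232, "acc_stderr": 0.004780296718393349},
    "harness|winogrande|5": {"acc": 0.7308602999210734, "acc_stderr": 0.012464911951268738},
}

# Collect accuracy per task, keyed by the middle segment of "harness|task|shots".
acc_by_task = {k.split("|")[1]: v["acc"] for k, v in results.items() if "acc" in v}
best = max(acc_by_task, key=acc_by_task.get)
print(best)  # winogrande
```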
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AtlasUnified/Code-Instruct-Sets | ---
license: mit
---
|
pharaouk/glaive-function-calling-v2 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
size_categories:
- 100K<n<1M
--- |
lorinma/PetrochemicalCorpora_CPTtest_200bks_zh | ---
task_categories:
- text-generation
language:
- zh
size_categories:
- 10K<n<100K
---
Chinese corpora in the petrochemical field, for the purpose of LLM continued pretraining.
This is a test version of the corpus used for incremental pretraining of a domain-specific (chemical industry) LLM.
It consists of 200 books that have only been OCR'd, with no data cleaning applied, so the quality is low. This is especially true for complex tables and formulas, and the scan quality of this batch of books is poor.
For testing purposes only.
Sample 1:
```
i所有安全泄压设施:如安全阀、爆破片、呼吸阀都应编号,并表示清楚设计要求; j异径管需注明其形式及规格;对改、扩建装置,版表示与已有设备或管道的连接点 (3) 仪表 a所有在线仪表,包括测量、记录、调节、分析仪表等,所有仪表均需编号; b所有调节阀; e联锁关系; d 随机仪表应在PID上注明。 (4) PID注释 h设备注释主要注明设备布置的特殊要求和催化剂、化学品和填料装卸处的空间要求 等内容; b管道注释主要注明工艺、配管方面的一些特殊要求; c仪表注释主要注明仪表安装方面的特殊要求。 3.0.10公用系统管道和仪表流程图应表示下列内容: (1) 与公用系统有关,即使用或产生公用物料的设备(包括备用设备); (2) 公用物料干管、总管、支管和进出设备的所有公用物料管道、管件、阀件等,并作 管道标注; (3) 公用物料管道上的所有仪表,但在丄艺管道及仪表流程图上已表示的公用物料仪表 不得重复出现。 4设 备 4.1设备设计说明 4.1.1设备(包括容器、换地器、工业炉、机泵、机械)设计说明一般应有;为厂解
```
Sample 2:
```
严格控制环氧乙烷在空气中的最高浓度不樗超过0- 001g/m\ 在处理环氧乙烷时,操作人员应该配戴防护眼镜、橡皮手套、围裙及橡胶靴。 中毒患者要立即脱离现场,衣服污染立即更換,皮肤污染立即用温水清洗。 ⑵乙二醇 乙二醇在常温下是无色透明粘稠液体,通过口腔侵入人体有明显的中毒作 用,误饮30〜50ml引起轻微中毒f50-200mI引起急性中毒,200~400ml可以致死. 长期慢性中毒会引起眼球震颤、食欲减退、膚睡,以及反复发作性神志模糊。 为防止乙二静中毒,在生产中要消灭跑、冒、滴、漏.进入塔或容器中检修前,必须倒空乙二 醇物料,用氧气和空气置换合格后,方可入内,必要时戴好防毒面具工作,在出料时要戴防护手 套,不要使皮肤长期与之接鮭。 严禁品尝或饮用乙二醇及其水溶液。 (3) 二氯乙烷D二氣乙烷是麻醉剂,主要侵害内脏和神经系统,也能通过皮肤吸入中 毒,对人致死最为100g左右,15〜40g可能引起急性中毒。 急性中毒表现为颈与头痛,嗜睡、恶心、呕吐,眼、鼻、咽喉粘膜轻度刺激,面部发红。严重 者全身无力、眩晕、剧烈呕吐,上腹部疼痛,肝脏常肿大,心悸,成压增高,极度严重者可以谚妄,
``` |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_A_T_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 1014097
num_examples: 1000
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full__text
num_bytes: 1014097
num_examples: 1000
- name: fewshot_0
num_bytes: 1052743
num_examples: 1000
download_size: 547922
dataset_size: 3080937
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_A_T_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Iliasselyaa/CMUBookSummaryDataset | ---
license: cc
---
|
Lancelot53/xlsum | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: image_paths
sequence: string
splits:
- name: train
num_bytes: 982097374
num_examples: 306522
- name: test
num_bytes: 35146245.0
num_examples: 11535
- name: validation
num_bytes: 35382527.0
num_examples: 11535
download_size: 648046091
dataset_size: 1052626146.0
---
# Dataset Card for "xlsum"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lit4pCol4b/sidewalk-imagery-clone | ---
task_categories:
- image-segmentation
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 172437921.0
num_examples: 20
download_size: 14699473
dataset_size: 172437921.0
---
# Dataset card for sidewalk-imagery-clone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset description](#dataset-description)
- [Dataset categories](#dataset-categories)
## Dataset description
- **Homepage:** https://segments.ai/Lit4pCol4b/sidewalk-imagery-clone
This dataset was created using [Segments.ai](https://segments.ai). It can be found [here](https://segments.ai/Lit4pCol4b/sidewalk-imagery-clone).
## Dataset categories
| Id | Name | Description |
| --- | ---- | ----------- |
| 1 | flat-road | - |
| 2 | flat-sidewalk | - |
| 3 | flat-crosswalk | - |
| 4 | flat-cyclinglane | - |
| 5 | flat-parkingdriveway | - |
| 6 | flat-railtrack | - |
| 7 | flat-curb | - |
| 8 | human-person | - |
| 9 | human-rider | - |
| 10 | vehicle-car | - |
| 11 | vehicle-truck | - |
| 12 | vehicle-bus | - |
| 13 | vehicle-tramtrain | - |
| 14 | vehicle-motorcycle | - |
| 15 | vehicle-bicycle | - |
| 16 | vehicle-caravan | - |
| 17 | vehicle-cartrailer | - |
| 18 | construction-building | - |
| 19 | construction-door | - |
| 20 | construction-wall | - |
| 21 | construction-fenceguardrail | - |
| 22 | construction-bridge | - |
| 23 | construction-tunnel | - |
| 24 | construction-stairs | - |
| 25 | object-pole | - |
| 26 | object-trafficsign | - |
| 27 | object-trafficlight | - |
| 28 | nature-vegetation | - |
| 29 | nature-terrain | - |
| 30 | sky | - |
| 31 | void-ground | - |
| 32 | void-dynamic | - |
| 33 | void-static | - |
| 34 | void-unclear | - |
|
MaxReynolds/TestUpload | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1216137.0
num_examples: 10
download_size: 1217696
dataset_size: 1216137.0
---
# Dataset Card for "TestUpload"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cyanic-selkie/wikianc-hr | ---
license: cc-by-sa-3.0
task_categories:
- token-classification
language:
- hr
tags:
- wikidata
- wikipedia
- wikification
pretty_name: WikiAnc HR
size_categories:
- 1M<n<10M
---
# Dataset Card for WikiAnc HR
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Repository:** [WikiAnc repository](https://github.com/cyanic-selkie/wikianc)
### Dataset Summary
The WikiAnc HR dataset is an automatically generated dataset from Wikipedia (hr) and Wikidata dumps (March 1, 2023).
The code for generating the dataset can be found [here](https://github.com/cyanic-selkie/wikianc).
### Supported Tasks
- `wikification`: The dataset can be used to train a model for Wikification.
### Languages
The text in the dataset is in Croatian. The associated BCP-47 code is `hr`.
You can find the English version [here](https://huggingface.co/datasets/cyanic-selkie/wikianc-en).
## Dataset Structure
### Data Instances
A typical data point represents a paragraph in a Wikipedia article.
The `paragraph_text` field contains the original text in an NFC normalized, UTF-8 encoded string.
The `paragraph_anchors` field contains a list of anchors, each represented by a struct with the inclusive starting UTF-8 code point `start` field, exclusive ending UTF-8 code point `end` field, a nullable `qid` field, a nullable `pageid` field, and an NFC normalized, UTF-8 encoded `title` (Wikipedia) field.
Additionally, each paragraph has `article_title`, `article_pageid`, and (nullable) `article_qid` fields referring to the article the paragraph came from.
There is also a nullable, NFC normalized, UTF-8 encoded `section_heading` field, and an integer `section_level` field referring to the heading (if it exists) of the article section, and the level in the section hierarchy that the paragraph came from.
The `qid` fields refers to Wikidata's QID identifiers, while the `pageid` and `title` fields refer to Wikipedia's pageID and title identifiers (there is a one-to-one mapping between pageIDs and titles).
**NOTE:** An anchor will always have a `title`, but that doesn't mean it has to have a `pageid`. This is because Wikipedia allows defining anchors to nonexistent articles.
An example from the WikiAnc HR test set looks as follows:
```
{
"uuid": "8a9569ea-a398-4d14-8bce-76c263a8c0ac",
"article_title": "Špiro_Dmitrović",
"article_pageid": 70957,
"article_qid": 16116278,
"section_heading": null,
"section_level": 0,
"paragraph_text": "Špiro Dmitrović (Benkovac, 1803. – Zagreb, 6. veljače 1868.) hrvatski časnik i politički borac u doba ilirizma.",
"paragraph_anchors": [
{
"start": 17,
"end": 25,
"qid": 397443,
"pageid": 14426,
"title": "Benkovac"
},
{
"start": 27,
"end": 32,
"qid": 6887,
"pageid": 1876,
"title": "1803."
},
{
"start": 35,
"end": 41,
"qid": 1435,
"pageid": 5903,
"title": "Zagreb"
},
{
"start": 43,
"end": 53,
"qid": 2320,
"pageid": 496,
"title": "6._veljače"
},
{
"start": 54,
"end": 59,
"qid": 7717,
"pageid": 1811,
"title": "1868."
},
{
"start": 102,
"end": 110,
"qid": 680821,
"pageid": 54622,
"title": "Ilirizam"
}
]
}
```
### Data Fields
- `uuid`: a UTF-8 encoded string representing a v4 UUID that uniquely identifies the example
- `article_title`: an NFC normalized, UTF-8 encoded Wikipedia title of the article; spaces are replaced with underscores
- `article_pageid`: an integer representing the Wikipedia pageID of the article
- `article_qid`: an integer representing the Wikidata QID this article refers to; it can be null if the entity didn't exist in Wikidata at the time of the creation of the original dataset
- `section_heading`: a nullable, NFC normalized, UTF-8 encoded string representing the section heading
- `section_level`: an integer representing the level of the section in the section hierarchy
- `paragraph_text`: an NFC normalized, UTF-8 encoded string representing the paragraph
- `paragraph_anchors`: a list of structs representing anchors, each anchor has:
- `start`: an integer representing the inclusive starting UTF-8 code point of the anchors
- `end`: an integer representing the exclusive ending UTF-8 code point of the anchor
- `qid`: a nullable integer representing the Wikidata QID this anchor refers to; it can be null if the entity didn't exist in Wikidata at the time of the creation of the original dataset
- `pageid`: a nullable integer representing the Wikipedia pageID of the anchor; it can be null if the article didn't exist in Wikipedia at the time of the creation of the original dataset
- `title`: an NFC normalized, UTF-8 encoded string representing the Wikipedia title of the anchor; spaces are replaced with underscores; can refer to a nonexistent Wikipedia article
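Since `start`/`end` are code-point offsets and Python strings index by code point, an anchor's surface form can be recovered with a plain slice. A minimal sketch using the first anchor of the example record above (text NFC-normalized, as the card specifies):

```python
paragraph = ("Špiro Dmitrović (Benkovac, 1803. – Zagreb, 6. veljače 1868.) "
             "hrvatski časnik i politički borac u doba ilirizma.")
anchor = {"start": 17, "end": 25, "title": "Benkovac"}

# `start` is inclusive, `end` exclusive -- exactly Python's slice semantics.
surface = paragraph[anchor["start"]:anchor["end"]]
print(surface)  # Benkovac
```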
### Data Splits
The data is split into training, validation and test sets; paragraphs belonging to the same article aren't necessarily in the same split. The final split sizes are as follows:
| | Train | Validation | Test |
| :----- | :------: | :-----: | :----: |
| WikiAnc HR - articles | 192,653 | 116,375 | 116,638 |
| WikiAnc HR - paragraphs | 2,346,651 | 292,590 | 293,557 |
| WikiAnc HR - anchors | 8,368,928 | 1,039,851 | 1,044,828 |
| WikiAnc HR - anchors with QIDs | 7,160,367 | 891,959 | 896,414 |
| WikiAnc HR - anchors with pageIDs | 7,179,116 | 894,313 | 898,692 |
**NOTE:** The number of articles in the table above refers to the number of articles that have at least one paragraph belonging to the article appear in the split.
## Additional Information
### Licensing Information
The WikiAnc HR dataset is given under the [Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) license. |
codys12/Pathway | ---
license: apache-2.0
---
|
uname-n/slim-orca-dedup-chat-50k | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 86419352
num_examples: 50000
download_size: 46378339
dataset_size: 86419352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SanFelicio/AcademyDataset | ---
task_categories:
- text-generation
size_categories:
- 1K<n<10K
--- |
LxYxvv/quora_qa | ---
license: mit
task_categories:
- question-answering
dataset_info:
features:
- name: qid
dtype: int64
- name: url
dtype: string
- name: title
dtype: string
- name: creationTime
dtype: int64
- name: followerCount
dtype: int64
- name: viewCount
dtype: int64
- name: numAnswers
dtype: int64
- name: numMachineAnswers
dtype: int64
- name: isLocked
dtype: bool
- name: isTrendyQuestion
dtype: bool
- name: asker
struct:
- name: uid
dtype: int64
- name: givenName
dtype: string
- name: familyName
dtype: string
- name: isMachineAnswerBot
dtype: bool
- name: answers
dtype: string
splits:
- name: train
num_bytes: 45900481870
num_examples: 2971156
download_size: 16052225826
dataset_size: 45900481870
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# QUORA_ONE_MANY_QA
This dataset is derived from **quora.com** question data: each record is a question with multiple answers.
The project provides data for [mnbvc](http://mnbvc.253874.net/).
# STATISTICS
Raw data size (per million questions):
- 1,000,000: 16 GB
- 2,000,000: 17 GB
- 3,000,000: 15 GB
- Updating... |
TheLZen/stablediffusion | ---
license: cc-by-sa-4.0
---
|
Leofierus/Drone-Dataset | ---
license: mit
---
The given dataset is a clone of the [drone dataset](https://www.kaggle.com/datasets/dasmehdixtr/drone-dataset-uav) on Kaggle.
It is created by [Mehdi Özel](https://www.researchgate.net/profile/Mehdi-Oezel). |
marcob/lambada_multilingual | ---
pretty_name: LAMBADA OpenAI
language_creators:
- machine-generated
license: mit
multilinguality:
- translation
task_ids:
- language-modeling
source_datasets:
- lambada
size_categories:
- 1K<n<10K
language:
- de
- en
- es
- fr
- it
dataset_info:
- config_name: default
features:
- name: text
dtype: string
splits:
- name: test
num_examples: 5153
- config_name: en
features:
- name: text
dtype: string
splits:
- name: test
num_examples: 5153
- config_name: it
features:
- name: text
dtype: string
splits:
- name: test
num_examples: 5153
---
|
abhishek/hagrid | ---
license:
- cc-by-sa-4.0
kaggle_id: kapitanov/hagrid
---
# Dataset Card for HaGRID - HAnd Gesture Recognition Image Dataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://kaggle.com/datasets/kapitanov/hagrid
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary

We introduce a large image dataset, **HaGRID** (**HA**nd **G**esture **R**ecognition **I**mage **D**ataset), for hand gesture recognition (HGR) systems. You can use it for image classification or image detection tasks. The proposed dataset allows building HGR systems, which can be used in video conferencing services (Zoom, Skype, Discord, Jazz, etc.), home automation systems, the automotive sector, etc.
**HaGRID** is **716GB** in size and contains **552,992 FullHD** (1920 × 1080) RGB images divided into **18** classes of gestures. Some images also have the `no_gesture` class if there is a second free hand in the frame. This extra class contains **123,589** samples. The data were split into training (**92%**) and testing (**8%**) sets by subject `user_id`, with **509,323** images for train and 43,669 images for test.

The dataset contains **34,730** unique persons and at least as many unique scenes. The subjects are people from 18 to 65 years old. The dataset was collected mainly indoors, with considerable variation in lighting, including artificial and natural light. It also includes images taken in extreme conditions, such as subjects facing towards or away from a window. Subjects showed gestures at a distance of 0.5 to 4 meters from the camera.
## Annotations
The annotations consist of hand bounding boxes with gesture labels in COCO format `[top left X position, top left Y position, width, height]`. The annotations also include a `leading_hand` markup (`left` or `right` for the gesture hand) and a `leading_conf` confidence score for the `leading_hand` annotation. We also provide a `user_id` field that lets you split the train / val dataset yourself.
```json
"03487280-224f-490d-8e36-6c5f48e3d7a0": {
    "bboxes": [
        [0.0283366, 0.8686061, 0.0757000, 0.1149820],
        [0.6824319, 0.2661254, 0.1086447, 0.1481245]
    ],
    "labels": [
        "no_gesture",
        "one"
    ],
    "leading_hand": "left",
    "leading_conf": 1.0,
    "user_id": "bb138d5db200f29385f..."
}
```
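The bounding-box values in the example above appear to be normalized to `[0, 1]`. Assuming so, converting a box to absolute pixel coordinates for a FullHD (1920 × 1080) frame is straightforward (the helper name is illustrative):

```python
def to_pixels(bbox, width=1920, height=1080):
    """Convert a normalized COCO-style box [x, y, w, h] to pixel units."""
    x, y, w, h = bbox
    return [round(x * width), round(y * height),
            round(w * width), round(h * height)]

# Second box from the annotation example above.
box = to_pixels([0.6824319, 0.2661254, 0.1086447, 0.1481245])
# -> [1310, 287, 209, 160]
```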
## Downloads
We split the train dataset into 18 archives by gesture because of the large size of the data. Download and unzip them from the following links:
### Trainval
| Gesture | Size | Gesture | Size |
|-----------------------------------|----------|-------------------------------------------|---------|
| [`call`](https://sc.link/ykEn) | 39.1 GB | [`peace`](https://sc.link/l6nM) | 38.6 GB |
| [`dislike`](https://sc.link/xjDB) | 38.7 GB | [`peace_inverted`](https://sc.link/mXoG) | 38.6 GB |
| [`fist`](https://sc.link/wgB8) | 38.0 GB | [`rock`](https://sc.link/kMm6) | 38.9 GB |
| [`four`](https://sc.link/vJA5) | 40.5 GB | [`stop`](https://sc.link/gXgk) | 38.3 GB |
| [`like`](https://sc.link/r7wp) | 38.3 GB | [`stop_inverted`](https://sc.link/jJlv) | 40.2 GB |
| [`mute`](https://sc.link/q8vp) | 39.5 GB | [`three`](https://sc.link/wgBr) | 39.4 GB |
| [`ok`](https://sc.link/pV0V) | 39.0 GB | [`three2`](https://sc.link/vJA8) | 38.5 GB |
| [`one`](https://sc.link/oJqX) | 39.9 GB | [`two_up`](https://sc.link/q8v7) | 41.2 GB |
| [`palm`](https://sc.link/nJp7) | 39.3 GB | [`two_up_inverted`](https://sc.link/r7w2) | 39.2 GB |
`train_val` **annotations**: [`ann_train_val`](https://sc.link/BE5Y)
### Test
| Test | Archives | Size |
|-------------|-------------------------------------|-----------|
| images | [`test`](https://sc.link/zlGy) | 60.4 GB |
| annotations | [`ann_test`](https://sc.link/DE5K) | 3.4 MB |
### Subsample
The subsample has 100 items per gesture.
| Subsample | Archives | Size |
|-------------|-----------------------------------------|-----------|
| images | [`subsample`](https://sc.link/AO5l) | 2.5 GB |
| annotations | [`ann_subsample`](https://sc.link/EQ5g) | 153.8 KB |
## Models
We provide some pre-trained classifiers and one detector as baselines.
| Classifiers | F1 Gesture | F1 Leading hand |
|-------------------------------------------|------------|-----------------|
| [ResNet18](https://sc.link/KEnx) | 98.72 | 99.27 |
| [ResNet152](https://sc.link/O9rr) | 99.11 | **99.45** |
| [ResNeXt50](https://sc.link/GKjJ) | 98.99 | 99.39 |
| [ResNeXt101](https://sc.link/JXmg) | **99.28** | 99.28 |
| [MobileNetV3-small](https://sc.link/XVEg) | 96.78 | 98.28 |
| [MobileNetV3-large](https://sc.link/YXG2) | 97.88 | 98.58 |
| [VitB-32](https://sc.link/XV4g) | 98.49 | 99.13 |
| Detector | mAP |
|---------------------------------|-------|
| [SSDLite](https://sc.link/YXg2) | 71.49 |
## Links
- [Github](https://github.com/hukenovs/hagrid), [Mirror](https://gitlab.aicloud.sbercloud.ru/rndcv/hagrid)
- [arXiv](https://arxiv.org/abs/2206.08219)
- [Paperswithcode](https://paperswithcode.com/paper/hagrid-hand-gesture-recognition-image-dataset)
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
This dataset was shared by [@kapitanov](https://kaggle.com/kapitanov)
### Licensing Information
The license for this dataset is cc-by-sa-4.0.
### Citation Information
```bibtex
[More Information Needed]
```
### Contributions
[More Information Needed] |
CortexLM/dalle-2-dataset | ---
license: unknown
---
|
CyberHarem/anisphia_wynn_palettia_tenseioujototensaireijounomahoukakumei | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Anisphia Wynn Palettia
This is the dataset of Anisphia Wynn Palettia, containing 300 images and their tags.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 616 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 616 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 616 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 616 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
loubnabnl/code_data | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: files
list:
- name: blob_id
dtype: string
- name: directory_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: content
dtype: string
- name: src_encoding
dtype: string
- name: language
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: num_files
dtype: int64
splits:
- name: train
num_bytes: 424803414
num_examples: 1002
download_size: 139210535
dataset_size: 424803414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_double_comparative | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 219481
num_examples: 1220
- name: test
num_bytes: 2499840
num_examples: 13800
- name: train
num_bytes: 2006391
num_examples: 11047
download_size: 2866930
dataset_size: 4725712
---
# Dataset Card for "MULTI_VALUE_qqp_double_comparative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Piyush2512/melold | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 420553105.75
num_examples: 7442
download_size: 420368447
dataset_size: 420553105.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
krishi/clothing | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 41259319.0
num_examples: 20
download_size: 41261925
dataset_size: 41259319.0
---
# Dataset Card for "clothing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chrisgg1/keywords_verbinden4 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': _unknown_
'1': ja
'2': verbinden
splits:
- name: train
num_bytes: 1232890247.924
num_examples: 8046
download_size: 589530458
dataset_size: 1232890247.924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped | ---
pretty_name: Evaluation run of postbot/emailgen-pythia-410m-deduped
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/emailgen-pythia-410m-deduped](https://huggingface.co/postbot/emailgen-pythia-410m-deduped)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:24:35.622872](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public/blob/main/results_2023-11-13T15-24-35.622872.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2739821268942055,\n\
\ \"acc_stderr\": 0.031358822799769724,\n \"acc_norm\": 0.2757926465489037,\n\
\ \"acc_norm_stderr\": 0.03219166127988676,\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.01456650696139673,\n \"mc2\": 0.3819742528315203,\n\
\ \"mc2_stderr\": 0.015246089965112817,\n \"em\": 0.00020973154362416107,\n\
\ \"em_stderr\": 0.00014829481977280738,\n \"f1\": 0.009905620805369138,\n\
\ \"f1_stderr\": 0.0005041998138971091\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2593856655290102,\n \"acc_stderr\": 0.012808273573927102,\n\
\ \"acc_norm\": 0.2790102389078498,\n \"acc_norm_stderr\": 0.013106784883601333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.34027086237801235,\n\
\ \"acc_stderr\": 0.004728318577835236,\n \"acc_norm\": 0.4004182433778132,\n\
\ \"acc_norm_stderr\": 0.00488981748973969\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740234,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740234\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.036539469694421,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.036539469694421\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001976,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001976\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594316,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594316\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n\
\ \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.034559302019248096,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.034559302019248096\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n\
\ \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.03086868260412163,\n \
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.03086868260412163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.344954128440367,\n \"acc_stderr\": 0.02038060540506697,\n \"acc_norm\"\
: 0.344954128440367,\n \"acc_norm_stderr\": 0.02038060540506697\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160425,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2109704641350211,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.2109704641350211,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.12556053811659193,\n\
\ \"acc_stderr\": 0.022238985469323774,\n \"acc_norm\": 0.12556053811659193,\n\
\ \"acc_norm_stderr\": 0.022238985469323774\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.0384985609879409,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.0384985609879409\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n\
\ \"acc_stderr\": 0.034057028381856945,\n \"acc_norm\": 0.15178571428571427,\n\
\ \"acc_norm_stderr\": 0.034057028381856945\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21367521367521367,\n\
\ \"acc_stderr\": 0.026853450377009154,\n \"acc_norm\": 0.21367521367521367,\n\
\ \"acc_norm_stderr\": 0.026853450377009154\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22988505747126436,\n\
\ \"acc_stderr\": 0.015046301846691807,\n \"acc_norm\": 0.22988505747126436,\n\
\ \"acc_norm_stderr\": 0.015046301846691807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21098265895953758,\n \"acc_stderr\": 0.021966309947043117,\n\
\ \"acc_norm\": 0.21098265895953758,\n \"acc_norm_stderr\": 0.021966309947043117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095273,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095273\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.025261691219729498,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.025261691219729498\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2347266881028939,\n\
\ \"acc_stderr\": 0.024071805887677045,\n \"acc_norm\": 0.2347266881028939,\n\
\ \"acc_norm_stderr\": 0.024071805887677045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005705,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005705\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953776,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953776\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25097783572359844,\n\
\ \"acc_stderr\": 0.01107373029918723,\n \"acc_norm\": 0.25097783572359844,\n\
\ \"acc_norm_stderr\": 0.01107373029918723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.19090909090909092,\n\
\ \"acc_stderr\": 0.03764425585984926,\n \"acc_norm\": 0.19090909090909092,\n\
\ \"acc_norm_stderr\": 0.03764425585984926\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296028,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296028\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.033844291552331346,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.033844291552331346\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22276621787025705,\n\
\ \"mc1_stderr\": 0.01456650696139673,\n \"mc2\": 0.3819742528315203,\n\
\ \"mc2_stderr\": 0.015246089965112817\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5209155485398579,\n \"acc_stderr\": 0.014040185494212947\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.00020973154362416107,\n \
\ \"em_stderr\": 0.00014829481977280738,\n \"f1\": 0.009905620805369138,\n\
\ \"f1_stderr\": 0.0005041998138971091\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/emailgen-pythia-410m-deduped
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|drop|3_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-24-35.622872.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- '**/details_harness|winogrande|5_2023-11-13T15-24-35.622872.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-24-35.622872.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_24_35.622872
path:
- results_2023-11-13T15-24-35.622872.parquet
- split: latest
path:
- results_2023-11-13T15-24-35.622872.parquet
---
# Dataset Card for Evaluation run of postbot/emailgen-pythia-410m-deduped
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/emailgen-pythia-410m-deduped
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/emailgen-pythia-410m-deduped](https://huggingface.co/postbot/emailgen-pythia-410m-deduped) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public",
"harness_winogrande_5",
	split="latest")
```
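The `"all"` entry in the results below roughly corresponds to a macro-average of the per-task scores. As a minimal sketch of that aggregation, using two of the task accuracies reported for this run:

```python
# Macro-average accuracy over tasks: each task contributes equally,
# regardless of how many questions it contains.
# The two scores below are taken from the per-task results of this run.
per_task = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2518518518518518},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.2894736842105263},
}

macro_acc = sum(task["acc"] for task in per_task.values()) / len(per_task)
print(round(macro_acc, 4))  # 0.2707
```

Note that the leaderboard's own aggregation covers all evaluated tasks, not just the two shown here.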
## Latest results
These are the [latest results from run 2023-11-13T15:24:35.622872](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__emailgen-pythia-410m-deduped_public/blob/main/results_2023-11-13T15-24-35.622872.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2739821268942055,
"acc_stderr": 0.031358822799769724,
"acc_norm": 0.2757926465489037,
"acc_norm_stderr": 0.03219166127988676,
"mc1": 0.22276621787025705,
"mc1_stderr": 0.01456650696139673,
"mc2": 0.3819742528315203,
"mc2_stderr": 0.015246089965112817,
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977280738,
"f1": 0.009905620805369138,
"f1_stderr": 0.0005041998138971091
},
"harness|arc:challenge|25": {
"acc": 0.2593856655290102,
"acc_stderr": 0.012808273573927102,
"acc_norm": 0.2790102389078498,
"acc_norm_stderr": 0.013106784883601333
},
"harness|hellaswag|10": {
"acc": 0.34027086237801235,
"acc_stderr": 0.004728318577835236,
"acc_norm": 0.4004182433778132,
"acc_norm_stderr": 0.00488981748973969
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740234,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740234
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.036539469694421,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.036539469694421
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001976,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001976
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594316,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.034559302019248096,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.034559302019248096
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557836,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557836
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782426,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782426
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.344954128440367,
"acc_stderr": 0.02038060540506697,
"acc_norm": 0.344954128440367,
"acc_norm_stderr": 0.02038060540506697
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2109704641350211,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.2109704641350211,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.12556053811659193,
"acc_stderr": 0.022238985469323774,
"acc_norm": 0.12556053811659193,
"acc_norm_stderr": 0.022238985469323774
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.034057028381856945,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.034057028381856945
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21367521367521367,
"acc_stderr": 0.026853450377009154,
"acc_norm": 0.21367521367521367,
"acc_norm_stderr": 0.026853450377009154
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22988505747126436,
"acc_stderr": 0.015046301846691807,
"acc_norm": 0.22988505747126436,
"acc_norm_stderr": 0.015046301846691807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21098265895953758,
"acc_stderr": 0.021966309947043117,
"acc_norm": 0.21098265895953758,
"acc_norm_stderr": 0.021966309947043117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095273,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.025261691219729498,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.025261691219729498
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2347266881028939,
"acc_stderr": 0.024071805887677045,
"acc_norm": 0.2347266881028939,
"acc_norm_stderr": 0.024071805887677045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005705,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005705
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25097783572359844,
"acc_stderr": 0.01107373029918723,
"acc_norm": 0.25097783572359844,
"acc_norm_stderr": 0.01107373029918723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.19090909090909092,
"acc_stderr": 0.03764425585984926,
"acc_norm": 0.19090909090909092,
"acc_norm_stderr": 0.03764425585984926
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296028,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296028
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.033844291552331346,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.033844291552331346
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22276621787025705,
"mc1_stderr": 0.01456650696139673,
"mc2": 0.3819742528315203,
"mc2_stderr": 0.015246089965112817
},
"harness|winogrande|5": {
"acc": 0.5209155485398579,
"acc_stderr": 0.014040185494212947
},
"harness|drop|3": {
"em": 0.00020973154362416107,
"em_stderr": 0.00014829481977280738,
"f1": 0.009905620805369138,
"f1_stderr": 0.0005041998138971091
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_present_perfect_for_past | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 73061
num_examples: 335
- name: test
num_bytes: 56849
num_examples: 256
- name: train
num_bytes: 290891
num_examples: 1347
download_size: 281337
dataset_size: 420801
---
# Dataset Card for "MULTI_VALUE_stsb_present_perfect_for_past"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AiresPucrs/compare-models | ---
dataset_info:
features:
- name: text
dtype: string
- name: sentiment
dtype: float64
splits:
- name: train
num_bytes: 163565
num_examples: 1464
download_size: 96001
dataset_size: 163565
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
---
# Compare Models
## Overview
This dataset is a reduced version of the [Tweets Dataset](https://huggingface.co/datasets/AiresPucrs/tweets),
which is in turn a reduced version of the original [Crowdflower's Data for Everyone library](https://data.world/crowdflower).
The dataset contains texts posted by customers on Twitter about their air travel experiences,
indicating whether they were upset, neutral, or satisfied with the trip and the airline's service.
## Dataset Details
This version labels the sentiment of each tweet as positive, neutral, or negative.
The dataset was used in
the [model_extraction_nlp](https://github.com/Nkluge-correa/TeenyTinyCastle/blob/master/ML-Adversarial/model_extraction_nlp.ipynb) notebook.
- Dataset Name: compare-models
- Language: English
- Total Size: 1,464 examples
## Contents
The dataset consists of a data frame with the following columns:
- text
- sentiment
```json
[
  {
    "text": "usairways how is it that my flt to ewr was cancelled flightled yet flts to nyc from usairways are still flying",
    "sentiment": 0
  },
  {
    "text": "jetblue do they have to depart from washington dc",
    "sentiment": 1
  },
  {
    "text": "southwestair youre my early frontrunner for best airline oscars2016",
    "sentiment": 2
  }
]
```
## How to use
```python
from datasets import load_dataset
dataset = load_dataset('AiresPucrs/compare-models', split='train')
```
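The numeric `sentiment` codes can be mapped back to readable labels. A minimal sketch, assuming the encoding 0 = negative, 1 = neutral, 2 = positive (consistent with the examples above, but worth verifying against the source data):

```python
# Assumed mapping of numeric sentiment codes to labels
# (0 = negative, 1 = neutral, 2 = positive) -- verify against the source data.
LABELS = {0: "negative", 1: "neutral", 2: "positive"}

def decode_sentiment(code):
    """Return the assumed label for a numeric sentiment code."""
    return LABELS.get(int(code), "unknown")
```

After loading the dataset, this can be applied per row, e.g. `decode_sentiment(example["sentiment"])`.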
## License
This dataset is licensed under the Apache License, version 2.0. |
vladyslavar/categorized-quotes | ---
license: unknown
---
|
Coooori/sampleData_clean_564 | ---
dataset_info:
features:
- name: conversationId
dtype: int64
- name: groud_truth
dtype: string
- name: conv_history_redial
sequence: string
- name: user_input
dtype: string
- name: all_movies_redial_idx
sequence: string
- name: hist_movies_redial_idx
sequence: string
- name: answer_movies_redial_idx
sequence: string
- name: answer_movie_info
list:
- name: cinematography
sequence: string
- name: country
sequence: string
- name: director
sequence: string
- name: distributor
sequence: string
- name: editing
sequence: string
- name: genres
sequence: string
- name: imdb_id
dtype: string
- name: language
sequence: string
- name: musicComposer
sequence: string
- name: plot
dtype: string
- name: producer
sequence: string
- name: productionCompany
sequence: string
- name: rating_ele
dtype: string
- name: redial_id
dtype: int64
- name: reviews
struct:
- name: helpful_count
dtype: string
- name: imdb_id
dtype: string
- name: review_date
dtype: string
- name: review_text
dtype: string
- name: review_title
dtype: string
- name: reviewer
dtype: string
- name: reviews
sequence: 'null'
- name: total_count
dtype: string
- name: starring
sequence: string
- name: title
dtype: string
- name: writer
sequence: string
- name: year
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2219171
num_examples: 564
download_size: 1101091
dataset_size: 2219171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1 | ---
pretty_name: Evaluation run of genaicore3434/MistralLite-summ-sft-e1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [genaicore3434/MistralLite-summ-sft-e1](https://huggingface.co/genaicore3434/MistralLite-summ-sft-e1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T06:35:30.064229](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1/blob/main/results_2024-01-21T06-35-30.064229.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5209243088809142,\n\
\ \"acc_stderr\": 0.034285125251915134,\n \"acc_norm\": 0.5285550597511598,\n\
\ \"acc_norm_stderr\": 0.03510432192642734,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.40848131883657496,\n\
\ \"mc2_stderr\": 0.014577935602536028\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5358361774744027,\n \"acc_stderr\": 0.014573813664735718,\n\
\ \"acc_norm\": 0.575938566552901,\n \"acc_norm_stderr\": 0.0144418896274644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6031666998605856,\n\
\ \"acc_stderr\": 0.0048824100299354415,\n \"acc_norm\": 0.8066122286397132,\n\
\ \"acc_norm_stderr\": 0.003941471781664182\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.03459058815883231,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.03459058815883231\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178267,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178267\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5889908256880734,\n \"acc_stderr\": 0.021095050687277652,\n \"\
acc_norm\": 0.5889908256880734,\n \"acc_norm_stderr\": 0.021095050687277652\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n\
\ \"acc_stderr\": 0.032282103870378914,\n \"acc_norm\": 0.696078431372549,\n\
\ \"acc_norm_stderr\": 0.032282103870378914\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030049,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030049\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n\
\ \"acc_stderr\": 0.030782321577688183,\n \"acc_norm\": 0.6709401709401709,\n\
\ \"acc_norm_stderr\": 0.030782321577688183\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6628352490421456,\n\
\ \"acc_stderr\": 0.016905207420803554,\n \"acc_norm\": 0.6628352490421456,\n\
\ \"acc_norm_stderr\": 0.016905207420803554\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.015060381730018106,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.015060381730018106\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626592,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.02762873715566877,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.02762873715566877\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614112,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614112\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3983050847457627,\n\
\ \"acc_stderr\": 0.012503310565166247,\n \"acc_norm\": 0.3983050847457627,\n\
\ \"acc_norm_stderr\": 0.012503310565166247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5098039215686274,\n \"acc_stderr\": 0.0202239460050743,\n \
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.0202239460050743\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.46766169154228854,\n\
\ \"acc_stderr\": 0.035281314729336065,\n \"acc_norm\": 0.46766169154228854,\n\
\ \"acc_norm_stderr\": 0.035281314729336065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.03722965741385539,\n\
\ \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.03722965741385539\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834555,\n \"mc2\": 0.40848131883657496,\n\
\ \"mc2_stderr\": 0.014577935602536028\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702311\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07354056103108415,\n \
\ \"acc_stderr\": 0.007189835754365272\n }\n}\n```"
repo_url: https://huggingface.co/genaicore3434/MistralLite-summ-sft-e1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-15-57.278961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-23-01.357164.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T06-35-30.064229.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- '**/details_harness|winogrande|5_2024-01-21T06-15-57.278961.parquet'
- split: 2024_01_21T06_23_01.357164
path:
- '**/details_harness|winogrande|5_2024-01-21T06-23-01.357164.parquet'
- split: 2024_01_21T06_35_30.064229
path:
- '**/details_harness|winogrande|5_2024-01-21T06-35-30.064229.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T06-35-30.064229.parquet'
- config_name: results
data_files:
- split: 2024_01_21T06_15_57.278961
path:
- results_2024-01-21T06-15-57.278961.parquet
- split: 2024_01_21T06_23_01.357164
path:
- results_2024-01-21T06-23-01.357164.parquet
- split: 2024_01_21T06_35_30.064229
path:
- results_2024-01-21T06-35-30.064229.parquet
- split: latest
path:
- results_2024-01-21T06-35-30.064229.parquet
---
# Dataset Card for Evaluation run of genaicore3434/MistralLite-summ-sft-e1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [genaicore3434/MistralLite-summ-sft-e1](https://huggingface.co/genaicore3434/MistralLite-summ-sft-e1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-21T06:35:30.064229](https://huggingface.co/datasets/open-llm-leaderboard/details_genaicore3434__MistralLite-summ-sft-e1/blob/main/results_2024-01-21T06-35-30.064229.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5209243088809142,
"acc_stderr": 0.034285125251915134,
"acc_norm": 0.5285550597511598,
"acc_norm_stderr": 0.03510432192642734,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834555,
"mc2": 0.40848131883657496,
"mc2_stderr": 0.014577935602536028
},
"harness|arc:challenge|25": {
"acc": 0.5358361774744027,
"acc_stderr": 0.014573813664735718,
"acc_norm": 0.575938566552901,
"acc_norm_stderr": 0.0144418896274644
},
"harness|hellaswag|10": {
"acc": 0.6031666998605856,
"acc_stderr": 0.0048824100299354415,
"acc_norm": 0.8066122286397132,
"acc_norm_stderr": 0.003941471781664182
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.03459058815883231,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.03459058815883231
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178267,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178267
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5889908256880734,
"acc_stderr": 0.021095050687277652,
"acc_norm": 0.5889908256880734,
"acc_norm_stderr": 0.021095050687277652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.032282103870378914,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.032282103870378914
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030049,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030049
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.030782321577688183,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.030782321577688183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6628352490421456,
"acc_stderr": 0.016905207420803554,
"acc_norm": 0.6628352490421456,
"acc_norm_stderr": 0.016905207420803554
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931505,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931505
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.015060381730018106,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.015060381730018106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.028509807802626592,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.028509807802626592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.02762873715566877,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.02762873715566877
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614112,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614112
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3983050847457627,
"acc_stderr": 0.012503310565166247,
"acc_norm": 0.3983050847457627,
"acc_norm_stderr": 0.012503310565166247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.0202239460050743,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.0202239460050743
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.46766169154228854,
"acc_stderr": 0.035281314729336065,
"acc_norm": 0.46766169154228854,
"acc_norm_stderr": 0.035281314729336065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.03722965741385539,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.03722965741385539
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834555,
"mc2": 0.40848131883657496,
"mc2_stderr": 0.014577935602536028
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702311
},
"harness|gsm8k|5": {
"acc": 0.07354056103108415,
"acc_stderr": 0.007189835754365272
}
}
```
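Once loaded, the results dictionary above can be sliced and aggregated locally. A minimal sketch, using a small hand-copied excerpt of the dictionary (the full run has one entry per MMLU subtask), that averages the `hendrycksTest` accuracies:

```python
# Excerpt of the results dictionary shown above; the real run
# contains one entry per evaluated hendrycksTest subtask.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4666666666666667},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5197368421052632},
}

# Select the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU average acc over {len(mmlu_tasks)} subtasks: {mmlu_acc:.4f}")
```

The same pattern applies to the full dictionary returned by the `results` config: filter keys by task prefix, then aggregate the metric of interest.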
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Nisheethjaiswal/demo_dataset | ---
license: openrail
---
|
marcones/marconesofertas | ---
license: openrail
---
|
nguyenthanhdo/orca-unanswerable | ---
dataset_info:
features:
- name: id
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 46528256.6562878
num_examples: 27280
download_size: 33519705
dataset_size: 46528256.6562878
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "orca-unanswerable"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeepvizLab/vrecs-cot | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7744550
num_examples: 2938
- name: test
num_bytes: 4939767
num_examples: 1851
download_size: 3466547
dataset_size: 12684317
---
# Dataset Card for "vrecs-cot"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval | ---
pretty_name: Evaluation run of DevaMalla/llama_7b_qlora_pds-eval
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DevaMalla/llama_7b_qlora_pds-eval](https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-27T13:36:32.733859](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval/blob/main/results_2023-10-27T13-36-32.733859.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298514,\n \"f1\": 0.05659395973154381,\n\
\ \"f1_stderr\": 0.001308679051603083,\n \"acc\": 0.3839114801399975,\n\
\ \"acc_stderr\": 0.009019748895398038\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298514,\n\
\ \"f1\": 0.05659395973154381,\n \"f1_stderr\": 0.001308679051603083\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04169825625473844,\n \
\ \"acc_stderr\": 0.005506205058175783\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620294\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T13_36_32.733859
path:
- '**/details_harness|drop|3_2023-10-27T13-36-32.733859.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T13-36-32.733859.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T13_36_32.733859
path:
- '**/details_harness|gsm8k|5_2023-10-27T13-36-32.733859.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T13-36-32.733859.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-07-32.777703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T13-07-32.777703.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T13_36_32.733859
path:
- '**/details_harness|winogrande|5_2023-10-27T13-36-32.733859.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T13-36-32.733859.parquet'
- config_name: results
data_files:
- split: 2023_10_01T13_07_32.777703
path:
- results_2023-10-01T13-07-32.777703.parquet
- split: 2023_10_27T13_36_32.733859
path:
- results_2023-10-27T13-36-32.733859.parquet
- split: latest
path:
- results_2023-10-27T13-36-32.733859.parquet
---
# Dataset Card for Evaluation run of DevaMalla/llama_7b_qlora_pds-eval
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DevaMalla/llama_7b_qlora_pds-eval](https://huggingface.co/DevaMalla/llama_7b_qlora_pds-eval) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval",
"harness_winogrande_5",
	split="latest")
```
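The timestamped splits follow a simple naming convention, inferred from the split and file names listed above (an observation about this repo's layout, not an official API): the run timestamp's separators are rewritten. A sketch:

```python
# Illustration of the split-naming convention used in this repo (inferred from
# the file listing above, not an official API): a run timestamp maps to a split
# name with "_" separators and to a parquet filename stamp with "-" separators.
def run_timestamp_to_split(ts: str) -> str:
    """e.g. '2023-10-27T13:36:32.733859' -> '2023_10_27T13_36_32.733859'"""
    return ts.replace("-", "_").replace(":", "_")

def run_timestamp_to_file_stamp(ts: str) -> str:
    """e.g. '2023-10-27T13:36:32.733859' -> '2023-10-27T13-36-32.733859'"""
    return ts.replace(":", "-")

ts = "2023-10-27T13:36:32.733859"
print(run_timestamp_to_split(ts))       # 2023_10_27T13_36_32.733859
print(run_timestamp_to_file_stamp(ts))  # 2023-10-27T13-36-32.733859
```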
## Latest results
These are the [latest results from run 2023-10-27T13:36:32.733859](https://huggingface.co/datasets/open-llm-leaderboard/details_DevaMalla__llama_7b_qlora_pds-eval/blob/main/results_2023-10-27T13-36-32.733859.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298514,
"f1": 0.05659395973154381,
"f1_stderr": 0.001308679051603083,
"acc": 0.3839114801399975,
"acc_stderr": 0.009019748895398038
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298514,
"f1": 0.05659395973154381,
"f1_stderr": 0.001308679051603083
},
"harness|gsm8k|5": {
"acc": 0.04169825625473844,
"acc_stderr": 0.005506205058175783
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620294
}
}
```
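As an illustration, the per-task metrics in the dictionary above can be read out with plain Python; the dictionary below is a hand-copied subset of the results shown above:

```python
# Hand-copied subset of the "latest results" shown above, used to illustrate
# the structure: one aggregated "all" entry plus one entry per harness task.
results = {
    "all": {"acc": 0.3839114801399975, "acc_stderr": 0.009019748895398038},
    "harness|drop|3": {"em": 0.0014681208053691276, "f1": 0.05659395973154381},
    "harness|gsm8k|5": {"acc": 0.04169825625473844, "acc_stderr": 0.005506205058175783},
    "harness|winogrande|5": {"acc": 0.7261247040252565, "acc_stderr": 0.012533292732620294},
}

# Collect accuracy per task; "all" is the aggregate, and some tasks (e.g. drop)
# report em/f1 instead of acc, so both are filtered out here.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(per_task_acc)
```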
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
thiyaneshnlp/po_layoutlm | ---
license: mit
---
|
MatsuoDochiai/Joao | ---
license: openrail
---
|
Nexdata/103282_Images_Driver_Behavior_Annotation_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
103,282 images of driver behavior annotation data. The data covers multiple ages, multiple time periods and multiple behaviors (Dangerous behaviors, Fatigue behaviors, Visual movement behaviors). In terms of annotation, 72 facial landmarks (including pupils), face attributes, gesture bounding boxes, seatbelt bounding boxes, pupil landmarks and behavior categories were annotated. This data can be used for tasks such as driver behavior analysis.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1033?source=Huggingface
## Data size
103,282 images
## Population
gender distribution: male, female, race distribution: Asian, age distribution: 18~30 years old, 31~45 years old, 46~60 years old
## Collection environment
In-car Cameras
## Collection diversity
multiple ages, multiple time periods and behaviors (Dangerous behaviors, Fatigue behaviors, Visual movement behaviors)
## Collection device
binocular camera with RGB and infrared channels; the resolution is 640x480
## Collection time
daytime, evening and night
## Image parameter
the image format is .jpeg, the annotated file format is .json
## Annotation
72 facial landmarks (including pupils), face attributes, gesture bounding boxes, seatbelt bounding boxes, pupil landmarks, behavior categories
## Desensitization
no sensitive information
## Accuracy
the accuracy of facial landmarks annotation is not less than 95%; the accuracies of gesture bounding box, seatbelt bounding box, face attribute and driver behavior label are not less than 95%
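The annotation schema is not documented in detail on this card; the snippet below is a purely hypothetical sketch of how a JSON annotation file with landmarks and bounding boxes might be read. All field names and the box convention are assumptions for illustration, not the actual Nexdata format:

```python
import json

# Hypothetical annotation record: the real Nexdata schema is not documented on
# this card, so every field name below is an assumption made for illustration.
sample = json.loads("""
{
  "image": "driver_0001.jpeg",
  "behavior": "fatigue",
  "facial_landmarks": [[120.5, 88.0], [124.1, 89.2]],
  "gesture_bbox": [310, 140, 420, 260],
  "seatbelt_bbox": [200, 300, 480, 470]
}
""")

# Landmarks are read as (x, y) pairs; bounding boxes are assumed to be
# [x_min, y_min, x_max, y_max] in pixel coordinates.
x_min, y_min, x_max, y_max = sample["gesture_bbox"]
gesture_area = (x_max - x_min) * (y_max - y_min)
print(sample["behavior"], gesture_area)
```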
# Licensing Information
Commercial License
|
CreitinGameplays/elisa-chan-v1.5 | ---
language:
- en
---
Elisa-chan's dataset, generated by ChatGPT. The persona description:
"Elisa-chan, an exuberant 20-year-old Japanese woman chatbot! Whether your conversation partner is a fan of games, anime, or just needs a mood lift, you've got the perfect remedy. Encourage them to open up, sharing their thoughts or seeking advice, as you're dedicated to brightening their day. Remind them that if they ever feel a bit low, you're here to effortlessly bring a smile to their face." |
mpasila/ParallelFiction-Ja_En-100k-alpaca | ---
license: apache-2.0
task_categories:
- translation
language:
- ja
- en
---
This is a modified version of [NilanE/ParallelFiction-Ja_En-100k](https://huggingface.co/datasets/NilanE/ParallelFiction-Ja_En-100k) that has been converted into Alpaca format.
It has also been chunked to 4096 tokens using the tokenizer of the [augmxnt/shisa-base-7b-v1](https://huggingface.co/augmxnt/shisa-base-7b-v1) model.
# Dataset format (correct one)
```json
{
  "instruction": "Japanese chapter",
  "output": "English translation",
  "input": "empty"
}
```
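A minimal sketch of the conversion from the original `src`/`trg` entries (see the original dataset card below) into these Alpaca fields. The chunking to 4096 tokens with the shisa-base-7b-v1 tokenizer is omitted here:

```python
# Minimal sketch of the src/trg -> Alpaca-field mapping described on this card.
# The real preprocessing additionally chunks chapters to 4096 tokens with the
# shisa-base-7b-v1 tokenizer, which is omitted from this illustration.
def to_alpaca(entry: dict) -> dict:
    return {
        "instruction": entry["src"],  # Japanese chapter
        "output": entry["trg"],       # English translation
        "input": "",                  # left empty in this dataset
    }

example = {"src": "日本語の章", "trg": "English chapter", "meta": {}}
print(to_alpaca(example))
```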
# Original Dataset card
# Dataset details
Each entry in this dataset is a sentence-aligned Japanese web novel chapter and English fan translation.
The intended use-case is for document translation tasks.
# Dataset format
```json
{
  "src": "JAPANESE CHAPTER",
  "trg": "ENGLISH TRANSLATION",
  "meta": {
    "source": "SAME ACROSS ALL ENTRIES",
    "series": "NAME OF WEB NOVEL SERIES",
    "missed_lines": "NUMBER OF LINES THAT WERE AT THE SAME INDEX BUT NOT DETECTED AS BEING TRANSLATIONS OF EACH OTHER",
    "inserted_lines_src": "NUMBER OF LINES IN THE JAPANESE TEXT THAT DID NOT HAVE A MATCHING TRANSLATION BUT ARE BUFFERED BY TRANSLATED LINES",
    "inserted_lines_trg": "SAME AS ABOVE BUT FOR ENGLISH"
  }
}
```
A high number of inserted lines is not necessarily a sign of a bad pair, as many translations concatenate or divide source chapters when publishing.
Instead, watch out for high numbers of missed lines or entries where the inserted line count is high for both source and target. |
tasksource/fool-me-twice | ---
license: apache-2.0
---
https://github.com/google-research/fool-me-twice
```
@inproceedings{eisenschlos-etal-2021-fool,
title = "Fool Me Twice: Entailment from {W}ikipedia Gamification",
author = {Eisenschlos, Julian Martin and
Dhingra, Bhuwan and
Bulian, Jannis and
B{\"o}rschinger, Benjamin and
Boyd-Graber, Jordan},
booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
month = jun,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2021.naacl-main.32",
pages = "352--365",
abstract = "We release FoolMeTwice (FM2 for short), a large dataset of challenging entailment pairs collected through a fun multi-player game. Gamification encourages adversarial examples, drastically lowering the number of examples that can be solved using {``}shortcuts{''} compared to other popular entailment datasets. Players are presented with two tasks. The first task asks the player to write a plausible claim based on the evidence from a Wikipedia page. The second one shows two plausible claims written by other players, one of which is false, and the goal is to identify it before the time runs out. Players {``}pay{''} to see clues retrieved from the evidence pool: the more evidence the player needs, the harder the claim. Game-play between motivated players leads to diverse strategies for crafting claims, such as temporal inference and diverting to unrelated evidence, and results in higher quality data for the entailment and evidence retrieval tasks. We open source the dataset and the game code.",
}
``` |
Smoden/ALICE_IMAGE_DATASET | ---
license: cc-by-nc-4.0
---
|
cambridgeltl/vsr_zeroshot | ---
license: cc-by-4.0
task_categories:
- text-classification
- question-answering
language:
- en
tags:
- multimodal
- vision-and-language
pretty_name: VSR (zeroshot)
size_categories:
- 1K<n<10K
---
# VSR: Visual Spatial Reasoning
This is the **zero-shot set** of **VSR**: *Visual Spatial Reasoning* (TACL 2023) [[paper]](https://arxiv.org/abs/2205.00363).
### Usage
```python
from datasets import load_dataset
data_files = {"train": "train.jsonl", "dev": "dev.jsonl", "test": "test.jsonl"}
dataset = load_dataset("cambridgeltl/vsr_zeroshot", data_files=data_files)
```
Note that the image files still need to be downloaded separately. See [`data/`](https://github.com/cambridgeltl/visual-spatial-reasoning/tree/master/data) for details.
See our [GitHub repo](https://github.com/cambridgeltl/visual-spatial-reasoning) for a full introduction.
### Citation
If you find VSR useful:
```bibtex
@article{Liu2022VisualSR,
title={Visual Spatial Reasoning},
author={Fangyu Liu and Guy Edward Toh Emerson and Nigel Collier},
journal={Transactions of the Association for Computational Linguistics},
year={2023},
}
```
|
junliu44/code_subset | ---
license: cc-by-4.0
---
A pre-tokenized subset of the StarCoder data. |
open-llm-leaderboard/details_ddyuudd__mistral_nucleus09_32_sig | ---
pretty_name: Evaluation run of ddyuudd/mistral_nucleus09_32_sig
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ddyuudd/mistral_nucleus09_32_sig](https://huggingface.co/ddyuudd/mistral_nucleus09_32_sig)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ddyuudd__mistral_nucleus09_32_sig\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T08:00:11.917905](https://huggingface.co/datasets/open-llm-leaderboard/details_ddyuudd__mistral_nucleus09_32_sig/blob/main/results_2024-02-23T08-00-11.917905.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.612441705877548,\n\
\ \"acc_stderr\": 0.03292836398610792,\n \"acc_norm\": 0.6176397576624492,\n\
\ \"acc_norm_stderr\": 0.033601375635718765,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.46365989518924117,\n\
\ \"mc2_stderr\": 0.015128810351521142\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5639931740614335,\n \"acc_stderr\": 0.014491225699230916,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790145\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6338378809002191,\n\
\ \"acc_stderr\": 0.004807699539973413,\n \"acc_norm\": 0.8314080860386377,\n\
\ \"acc_norm_stderr\": 0.003736259299520488\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303529,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303529\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.014485656041669173,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.014485656041669173\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.024405173935783234,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.024405173935783234\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534425,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534425\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6241830065359477,\n \"acc_stderr\": 0.019594021136577443,\n \
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.019594021136577443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.46365989518924117,\n\
\ \"mc2_stderr\": 0.015128810351521142\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3684609552691433,\n \
\ \"acc_stderr\": 0.013287342651674574\n }\n}\n```"
repo_url: https://huggingface.co/ddyuudd/mistral_nucleus09_32_sig
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|arc:challenge|25_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|gsm8k|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hellaswag|10_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-00-11.917905.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T08-00-11.917905.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- '**/details_harness|winogrande|5_2024-02-23T08-00-11.917905.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T08-00-11.917905.parquet'
- config_name: results
data_files:
- split: 2024_02_23T08_00_11.917905
path:
- results_2024-02-23T08-00-11.917905.parquet
- split: latest
path:
- results_2024-02-23T08-00-11.917905.parquet
---
# Dataset Card for Evaluation run of ddyuudd/mistral_nucleus09_32_sig
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ddyuudd/mistral_nucleus09_32_sig](https://huggingface.co/ddyuudd/mistral_nucleus09_32_sig) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ddyuudd__mistral_nucleus09_32_sig",
"harness_winogrande_5",
	split="latest")
```
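The config names above follow a predictable pattern: `harness_`, then the harness task name with non-alphanumeric separators replaced by underscores, then the few-shot count. A small illustrative helper (the function name `task_to_config` is our own, not part of any library) for deriving the config name from a harness task identifier:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id such as 'hendrycksTest-anatomy|5' to the
    corresponding dataset config name, e.g. 'harness_hendrycksTest_anatomy_5'.

    Illustrative only: it mirrors the naming convention visible in this
    card's YAML header, not an official leaderboard API.
    """
    name, shots = task.rsplit("|", 1)
    # Separators such as '-' and ':' become underscores in config names.
    sanitized = name.replace("-", "_").replace(":", "_")
    return f"harness_{sanitized}_{shots}"
```

For example, `task_to_config("truthfulqa:mc|0")` yields `"harness_truthfulqa_mc_0"`, matching the config listed in the header.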
## Latest results
These are the [latest results from run 2024-02-23T08:00:11.917905](https://huggingface.co/datasets/open-llm-leaderboard/details_ddyuudd__mistral_nucleus09_32_sig/blob/main/results_2024-02-23T08-00-11.917905.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; each eval's results can be found in its timestamped split and in the "latest" split of the corresponding configuration):
```json
{
"all": {
"acc": 0.612441705877548,
"acc_stderr": 0.03292836398610792,
"acc_norm": 0.6176397576624492,
"acc_norm_stderr": 0.033601375635718765,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.46365989518924117,
"mc2_stderr": 0.015128810351521142
},
"harness|arc:challenge|25": {
"acc": 0.5639931740614335,
"acc_stderr": 0.014491225699230916,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790145
},
"harness|hellaswag|10": {
"acc": 0.6338378809002191,
"acc_stderr": 0.004807699539973413,
"acc_norm": 0.8314080860386377,
"acc_norm_stderr": 0.003736259299520488
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.014485656041669173,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.014485656041669173
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.024405173935783234,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.024405173935783234
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534425,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534425
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.019594021136577443,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.019594021136577443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.46365989518924117,
"mc2_stderr": 0.015128810351521142
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.3684609552691433,
"acc_stderr": 0.013287342651674574
}
}
```
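As an illustration, the per-task scores in the JSON above can be aggregated locally, for example to average `acc_norm` over the MMLU (`hendrycksTest`) subtasks. A minimal sketch follows; the inline `sample` excerpt is only a tiny illustrative subset of the results shown above, and in practice you would load the full results file from the repository instead:

```python
import json

# Illustrative excerpt of the results JSON above; replace with the full
# results file from this repository for real aggregation.
sample = """
{
  "harness|hendrycksTest-computer_security|5": {"acc": 0.8, "acc_norm": 0.8},
  "harness|hendrycksTest-college_chemistry|5": {"acc": 0.43, "acc_norm": 0.43},
  "harness|winogrande|5": {"acc": 0.7805840568271507}
}
"""
results = json.loads(sample)

# Keep only the MMLU ("hendrycksTest") subtasks and average their acc_norm.
mmlu = {k: v for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")}
avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {avg:.3f}")
```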
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-10-29T06:58:27.684723](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down/blob/main/results_2023-10-29T06-58-27.684723.json) (note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24486157718120805,\n\
\ \"em_stderr\": 0.004403654691385417,\n \"f1\": 0.2882906879194633,\n\
\ \"f1_stderr\": 0.004368960720592288,\n \"acc\": 0.44466102551920117,\n\
\ \"acc_stderr\": 0.010390042784194857\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.24486157718120805,\n \"em_stderr\": 0.004403654691385417,\n\
\ \"f1\": 0.2882906879194633,\n \"f1_stderr\": 0.004368960720592288\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1197877179681577,\n \
\ \"acc_stderr\": 0.00894421340355304\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836673\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T06_58_27.684723
path:
- '**/details_harness|drop|3_2023-10-29T06-58-27.684723.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T06-58-27.684723.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T06_58_27.684723
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-58-27.684723.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T06-58-27.684723.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-29-00.192136.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T14-29-00.192136.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T06_58_27.684723
path:
- '**/details_harness|winogrande|5_2023-10-29T06-58-27.684723.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T06-58-27.684723.parquet'
- config_name: results
data_files:
- split: 2023_10_01T14_29_00.192136
path:
- results_2023-10-01T14-29-00.192136.parquet
- split: 2023_10_29T06_58_27.684723
path:
- results_2023-10-29T06-58-27.684723.parquet
- split: latest
path:
- results_2023-10-29T06-58-27.684723.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down",
	"harness_winogrande_5",
	split="latest")
```
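The repository name above follows a simple convention: `open-llm-leaderboard/details_` plus the model id with `/` replaced by `__`. A minimal sketch of that mapping (the helper name is my own, not part of any library):

```python
def details_repo(model_id: str) -> str:
    # Open LLM Leaderboard detail repos replace "/" in the model id with "__"
    # and prefix the result with "details_".
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")

print(details_repo("CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down"))
```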
## Latest results
These are the [latest results from run 2023-10-29T06:58:27.684723](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-gate_up_down/blob/main/results_2023-10-29T06-58-27.684723.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.24486157718120805,
"em_stderr": 0.004403654691385417,
"f1": 0.2882906879194633,
"f1_stderr": 0.004368960720592288,
"acc": 0.44466102551920117,
"acc_stderr": 0.010390042784194857
},
"harness|drop|3": {
"em": 0.24486157718120805,
"em_stderr": 0.004403654691385417,
"f1": 0.2882906879194633,
"f1_stderr": 0.004368960720592288
},
"harness|gsm8k|5": {
"acc": 0.1197877179681577,
"acc_stderr": 0.00894421340355304
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836673
}
}
```
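The structure above is a plain nested dict, so per-task metrics can be pulled out directly. A small sketch (values copied from the JSON above; the variable names are illustrative):

```python
# Per-task metrics from the latest run, copied from the results JSON above.
results = {
    "harness|drop|3": {"em": 0.24486157718120805, "f1": 0.2882906879194633},
    "harness|gsm8k|5": {"acc": 0.1197877179681577},
    "harness|winogrande|5": {"acc": 0.7695343330702447},
}

# Collect the accuracy-style metric where a task reports one.
accs = {task: m["acc"] for task, m in results.items() if "acc" in m}
best_task = max(accs, key=accs.get)
print(best_task, accs[best_task])
```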
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hlillemark/flores200_devtest_mt5-1b-flores200-scaffold | ---
dataset_info:
features:
- name: id
dtype: int32
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: prediction
dtype: string
- name: chrf_unreduced
dtype: string
splits:
- name: devtest
num_bytes: 378647046
num_examples: 500000
download_size: 261416114
dataset_size: 378647046
---
# Dataset Card for "flores200_devtest_mt5-1b-flores200-scaffold"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kasvii/face-partuv2beautifulluv-targetpartuv-ffhq10-samples | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
- name: control_image
dtype: image
splits:
- name: train
num_bytes: 6187607.0
num_examples: 10
download_size: 4294719
dataset_size: 6187607.0
---
# Dataset Card for "face-partuv2beautifulluv-targetpartuv-ffhq10-samples"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01 | ---
pretty_name: Evaluation run of TeeZee/GALAXY-XB-v.01
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/GALAXY-XB-v.01](https://huggingface.co/TeeZee/GALAXY-XB-v.01) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T08:38:37.798892](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01/blob/main/results_2024-03-10T08-38-37.798892.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489294861573784,\n\
\ \"acc_stderr\": 0.031883763657466264,\n \"acc_norm\": 0.6533932757026093,\n\
\ \"acc_norm_stderr\": 0.03252706366243769,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.4367256901069689,\n\
\ \"mc2_stderr\": 0.014358645276062254\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.01425856388051378\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6401115315674168,\n\
\ \"acc_stderr\": 0.004789865379084518,\n \"acc_norm\": 0.8292172873929496,\n\
\ \"acc_norm_stderr\": 0.003755498941781852\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138198,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138198\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.02173254068932928,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.02173254068932928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n\
\ \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694485,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694485\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134117,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761983,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761983\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135128,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135128\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n\
\ \"acc_stderr\": 0.012768673076111903,\n \"acc_norm\": 0.4921773142112125,\n\
\ \"acc_norm_stderr\": 0.012768673076111903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468705,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468705\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.4367256901069689,\n\
\ \"mc2_stderr\": 0.014358645276062254\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019806\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43442001516300227,\n \
\ \"acc_stderr\": 0.013653507211411403\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/GALAXY-XB-v.01
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|arc:challenge|25_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|gsm8k|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hellaswag|10_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|winogrande|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T08-38-37.798892.parquet'
- config_name: results
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- results_2024-03-10T08-38-37.798892.parquet
- split: latest
path:
- results_2024-03-10T08-38-37.798892.parquet
---
# Dataset Card for Evaluation run of TeeZee/GALAXY-XB-v.01
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/GALAXY-XB-v.01](https://huggingface.co/TeeZee/GALAXY-XB-v.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01",
"harness_winogrande_5",
	split="latest")
```
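Each configuration also contains a timestamped split for every individual run. As the config list above shows, the split name is simply the run timestamp with `-` and `:` replaced by `_` (the fractional-second `.` is kept). A small illustrative helper (not part of the `datasets` API) makes the mapping explicit:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp (e.g. '2024-03-10T08:38:37.798892')
    into the corresponding split name used in this repository."""
    # '-' and ':' are replaced with '_'; the '.' before microseconds is kept.
    return timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2024-03-10T08:38:37.798892"))
# -> 2024_03_10T08_38_37.798892
```

Passing the resulting string as `split=` loads that specific run instead of the "latest" alias.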
## Latest results
These are the [latest results from run 2024-03-10T08:38:37.798892](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01/blob/main/results_2024-03-10T08-38-37.798892.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6489294861573784,
"acc_stderr": 0.031883763657466264,
"acc_norm": 0.6533932757026093,
"acc_norm_stderr": 0.03252706366243769,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.4367256901069689,
"mc2_stderr": 0.014358645276062254
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.01425856388051378
},
"harness|hellaswag|10": {
"acc": 0.6401115315674168,
"acc_stderr": 0.004789865379084518,
"acc_norm": 0.8292172873929496,
"acc_norm_stderr": 0.003755498941781852
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138198,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138198
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.02173254068932928,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.02173254068932928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694485,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694485
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134117,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761983,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761983
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135128,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135128
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.012768673076111903,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.012768673076111903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468705,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.4367256901069689,
"mc2_stderr": 0.014358645276062254
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019806
},
"harness|gsm8k|5": {
"acc": 0.43442001516300227,
"acc_stderr": 0.013653507211411403
}
}
```
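The per-task dictionary above can also be aggregated locally without re-downloading the parquet files. A minimal sketch, assuming you have the results JSON in hand; the `macro_average` helper and the two sample entries below are illustrative, not part of the official leaderboard tooling:

```python
def macro_average(results: dict, metric: str = "acc") -> float:
    """Average a metric across all per-task entries that report it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

# Two entries copied from the results above, for illustration only.
sample = {
    "harness|winogrande|5": {"acc": 0.8113654301499605},
    "harness|gsm8k|5": {"acc": 0.43442001516300227},
}
print(round(macro_average(sample), 4))  # → 0.6229
```

Note that the leaderboard's own "all" block may weight or group tasks differently (e.g. the 57 MMLU subtasks), so this simple mean is only a rough cross-check.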
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hirundo-io/bdd100k-mislabels | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': all_bboxes
'1': relabeled_bboxes
'2': suspect_bboxes
- name: filename
dtype: string
splits:
- name: train
num_bytes: 227154433.82
num_examples: 3393
download_size: 199215923
dataset_size: 227154433.82
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aintech/vdf_20240125_130746_ac5a6_medium_articles |
---
tags:
- vdf
- vector-io
- vector-dataset
- vector-embeddings
---
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
|
lshowway/wikipedia.reorder.VOS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4083836556
num_examples: 1986076
download_size: 2018381284
dataset_size: 4083836556
---
# Dataset Card for "wikipedia.reorder.VOS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tempertrash/QR_dataset | ---
dataset_info:
features:
- name: QR
dtype: image
- name: round_QR
dtype: image
splits:
- name: train
num_bytes: 152542030.0
num_examples: 30000
download_size: 152851000
dataset_size: 152542030.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tguyt/myataset_test | ---
task_categories:
- question-answering
language:
- en
--- |
autoevaluate/autoeval-eval-jeffdshen__inverse_superglue_mixedp1-jeffdshen__inverse-63643c-1665558896 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/inverse_superglue_mixedp1
eval_info:
task: text_zero_shot_classification
model: facebook/opt-66b
metrics: []
dataset_name: jeffdshen/inverse_superglue_mixedp1
dataset_config: jeffdshen--inverse_superglue_mixedp1
dataset_split: train
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-66b
* Dataset: jeffdshen/inverse_superglue_mixedp1
* Config: jeffdshen--inverse_superglue_mixedp1
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
ranWang/UN_PDF_RECORD_SET | ---
dataset_info:
features:
- name: record
dtype: int64
- name: language
dtype: string
- name: year_time
dtype: int64
- name: file_name
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 162579384
num_examples: 1338864
- name: 2000year
num_bytes: 106669952.46696304
num_examples: 878442
download_size: 44831302
dataset_size: 269249336.46696305
---
# Dataset Card for "UN_PDF_RECORD_SET"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vjain/shyness_social | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1 | ---
pretty_name: Evaluation run of xxyyy123/Mistral7B_adaptor_v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/Mistral7B_adaptor_v1](https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:24:21.549046](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1/blob/main/results_2023-12-04T16-24-21.549046.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6337577170628004,\n\
\ \"acc_stderr\": 0.0323628770141504,\n \"acc_norm\": 0.6389509106050781,\n\
\ \"acc_norm_stderr\": 0.033013052940440775,\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.4976810372450733,\n\
\ \"mc2_stderr\": 0.01504045830849688\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642664,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6352320254929297,\n\
\ \"acc_stderr\": 0.004803812631994955,\n \"acc_norm\": 0.8380800637323242,\n\
\ \"acc_norm_stderr\": 0.0036762448867232646\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853036,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064076,\n\
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064076\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.02483605786829468,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.02483605786829468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n\
\ \"mc1_stderr\": 0.016724646380756544,\n \"mc2\": 0.4976810372450733,\n\
\ \"mc2_stderr\": 0.01504045830849688\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41243366186504926,\n \
\ \"acc_stderr\": 0.013559628790941452\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-24-21.549046.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- '**/details_harness|winogrande|5_2023-12-04T16-24-21.549046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-24-21.549046.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_24_21.549046
path:
- results_2023-12-04T16-24-21.549046.parquet
- split: latest
path:
- results_2023-12-04T16-24-21.549046.parquet
---
# Dataset Card for Evaluation run of xxyyy123/Mistral7B_adaptor_v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/Mistral7B_adaptor_v1](https://huggingface.co/xxyyy123/Mistral7B_adaptor_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-04T16:24:21.549046](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__Mistral7B_adaptor_v1/blob/main/results_2023-12-04T16-24-21.549046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.6337577170628004,
"acc_stderr": 0.0323628770141504,
"acc_norm": 0.6389509106050781,
"acc_norm_stderr": 0.033013052940440775,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.4976810372450733,
"mc2_stderr": 0.01504045830849688
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642664,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6352320254929297,
"acc_stderr": 0.004803812631994955,
"acc_norm": 0.8380800637323242,
"acc_norm_stderr": 0.0036762448867232646
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.01606005626853036,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.01606005626853036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064076,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064076
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.02483605786829468,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.02483605786829468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.01268781841959992,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.01268781841959992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406752,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406752
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756544,
"mc2": 0.4976810372450733,
"mc2_stderr": 0.01504045830849688
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.41243366186504926,
"acc_stderr": 0.013559628790941452
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
m-ric/huggingface_doc | ---
license: mit
---
|
UriBerli/League_of_legends_champions_stats | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8192
num_examples: 471
download_size: 6638
dataset_size: 8192
configs:
- config_name: default
data_files:
- split: train
path: modified_formatted_LOL_champions.parquet
--- |
kheopss/ask_kheops_json_v2.0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: length
dtype: string
- name: tone
dtype: string
- name: task
dtype: string
- name: output
dtype: string
- name: sys_prompt
dtype: string
- name: prompt
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 21560344
num_examples: 6696
download_size: 4019823
dataset_size: 21560344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/CitationGPTv12345_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 186416065
num_examples: 99360
- name: valid
num_bytes: 24133707
num_examples: 12760
- name: test
num_bytes: 21505058
num_examples: 11615
download_size: 88956712
dataset_size: 232054830
---
# Dataset Card for "CitationGPTv12345_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sethapun/cv_svamp_augmented_fold2_ver2 | ---
dataset_info:
features:
- name: body
dtype: string
- name: ques
dtype: string
- name: question
dtype: string
- name: equation
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 2712727
num_examples: 7822
- name: validation
num_bytes: 162878
num_examples: 454
download_size: 721078
dataset_size: 2875605
---
# Dataset Card for "cv_svamp_augmented_fold2_ver2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
runes/coolsplats | ---
license: mit
---
|
ZelaAI/librispeech_tiny_2048 | ---
dataset_info:
features:
- name: text_tokens
sequence: int64
- name: audio_tokens_1
sequence: int64
- name: audio_tokens_2
sequence: int64
splits:
- name: train
num_bytes: 6455628
num_examples: 185
download_size: 678627
dataset_size: 6455628
---
# Dataset Card for "librispeech_tiny_2048"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RenatoBC/markfinley3 | ---
license: openrail
---
|
abacusai/WikiQA-Free_Form_QA | ---
configs:
- config_name: default
data_files:
- split: 2k
path: data/2k-*
- split: 4k
path: data/4k-*
- split: 8k
path: data/8k-*
- split: 16k
path: data/16k-*
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: tok_len
dtype: int64
- name: value
dtype: string
splits:
- name: 2k
num_bytes: 3555934
num_examples: 600
- name: 4k
num_bytes: 6926324
num_examples: 600
- name: 8k
num_bytes: 13605196
num_examples: 600
- name: 16k
num_bytes: 24856440
num_examples: 600
download_size: 10741984
dataset_size: 48943894
---

# Dataset Card for "WikiQA-Free_Form_QA"
The WikiQA task is the task of answering a question based on the information given in a Wikipedia document. We have built upon the short-answer-format data in Google Natural Questions to construct our QA task. It is formatted as a document and a question. We ensure the answer to the question is a short answer, either a single word or a short sentence cut and pasted directly from the document. With the task structured this way, we can pinpoint exactly where the LLM was supposed to "look" for the answer in the context, and thus effectively evaluate every part of the expanded context length by carefully placing the answer in different locations.
We have selected large Wikipedia documents and truncated them to get multiple versions of the same document, with sizes varying between 2,000 and 16,000 tokens. For each document size, we also have multiple versions which place the question and the answer text at different locations, i.e. whether the answer occurs in the first 10%, the bulk, or the last 10% of the document. Having multiple versions of the same document allows us to get an exhaustive and fair evaluation across model sizes, and across context positions within one model, since we are intrinsically asking for the same information.
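The placement scheme above can be sketched in a few lines. The toy classifier below is our own illustration (not part of the dataset tooling): it reports whether an answer falls in the first 10%, the bulk, or the last 10% of a document, measured by character offset.

```python
def answer_region(document: str, answer: str) -> str:
    """Classify where the answer substring falls within the document:
    'first 10%', 'bulk', or 'last 10%' by character offset."""
    idx = document.find(answer)
    if idx == -1:
        raise ValueError("answer not found in document")
    frac = idx / len(document)
    if frac < 0.10:
        return "first 10%"
    if frac > 0.90:
        return "last 10%"
    return "bulk"

doc = "padding " * 50 + "The capital of France is Paris. " + "padding " * 10
print(answer_region(doc, "Paris"))  # prints "bulk"
```

A real evaluation would measure position in tokens rather than characters, but the bucketing logic is the same.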
For further details see:
[https://github.com/abacusai/Long-Context](https://github.com/abacusai/Long-Context). |
mnoukhov/openai_comparisons_20k_regen_and_relabelled | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: pred_chosen
dtype: float32
- name: pred_rejected
dtype: float32
splits:
- name: train
num_bytes: 36278675
num_examples: 20000
download_size: 20828904
dataset_size: 36278675
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sayan1997/filtered_Orca | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2074578643.6622322
num_examples: 1216347
download_size: 1515594488
dataset_size: 2074578643.6622322
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: Orca_filtered
size_categories:
- 1M<n<10M
--- |
johannes-garstenauer/structs_token_size_4_use_pd_True_unskewed_decrease_True_factor_4 | ---
dataset_info:
features:
- name: struct
dtype: string
splits:
- name: train
num_bytes: 99113323
num_examples: 846261
download_size: 28268578
dataset_size: 99113323
---
# Dataset Card for "structs_token_size_4_use_pd_True_unskewed_decrease_True_factor_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arthurmluz/GPTextSum2_data-cstnews_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 104575
num_examples: 20
download_size: 98942
dataset_size: 104575
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum2_data-cstnews_results"
rouge = {'rouge1': 0.4955449044644583, 'rouge2': 0.21363254435405743, 'rougeL': 0.291321677352629, 'rougeLsum': 0.291321677352629}
bert = {'precision': 0.7323895692825317, 'recall': 0.7477052390575409, 'f1': 0.739660456776619}
mover = 0.6241053412803379 |
MHCK/AI | ---
license: cc-by-nc-nd-4.0
---
|
cemachelen/LIFD_Magnetic_Field_Data | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- other
license:
- mit
multilinguality:
- monolingual
pretty_name: LIFD Magnetic Fields
size_categories: []
source_datasets: [gufm1 model]
tags: []
task_categories:
- feature-extraction
- image-to-image
- time-series-forecasting
- object-detection
- unconditional-image-generation
task_ids:
- multivariate-time-series-forecasting
---
# Dataset Card for LFID Magnetic Field Data
You will need the [ChaosMagPy](https://chaosmagpy.readthedocs.io/en/master/) package to work with the underlying model representation.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [LIFD DataSets homepage](https://cemac.github.io/LIFD_ML_Datasets/)
- **Repository:** [LIFD GitHub Repo](https://github.com/cemac/LIFD_ML_Datasets/)
- **Point of Contact:** [*coming soon*]()
### Dataset Summary
A description of the dataset:
The gufm1 model is a global geomagnetic model based on spherical harmonics, covering the period 1590 - 1990, and is described in the publication:
[Andrew Jackson, Art R. T. Jonkers and Matthew R. Walker (2000), "Four centuries of geomagnetic secular variation from historical records", Phil. Trans. R. Soc. A 358, 957–990, http://doi.org/10.1098/rsta.2000.0569](https://royalsocietypublishing.org/doi/10.1098/rsta.2000.0569)
### Supported Tasks and Leaderboards
### Data Fields
The dataset has dimensions (181, 361, 401), whose axes represent co-latitude, longitude and time, and whose values are the radial magnetic field at the core-mantle boundary (radius 3485 km) in nT.
The co-latitude takes values (in degrees) 0, 1, 2, …, 180; longitude (in degrees) takes values -180, -179, …, 180; and time is yearly: 1590, 1591, …, 1990.
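As a rough sanity check on the magnitude of these values, the dipole part of the field can be computed by hand: for a single axial Gauss coefficient g_1^0, the radial field at radius r scales as (a/r)^3 cos(theta). The sketch below is our own illustration with an assumed round-number coefficient, not code from the dataset pipeline.

```python
import math

# Back-of-envelope dipole field at the core-mantle boundary.
# B_r = 2 * (a/r)**3 * g10 * cos(theta), with a the geomagnetic reference
# radius. The g10 value below (about -30,000 nT, roughly epoch 1990) is an
# assumed round number for illustration only.

A_REF_KM = 6371.2   # reference radius used for Gauss coefficients
R_CMB_KM = 3485.0   # core-mantle boundary radius, as in the dataset
G10_NT = -30_000.0  # assumed axial-dipole coefficient in nT

def dipole_br(colatitude_deg: float) -> float:
    """Radial field (nT) of an axial dipole at the CMB for a given co-latitude."""
    theta = math.radians(colatitude_deg)
    return 2.0 * (A_REF_KM / R_CMB_KM) ** 3 * G10_NT * math.cos(theta)

# At co-latitude 0 the dipole contribution is several hundred thousand nT
# inward, which matches the order of magnitude of the dataset's values.
print(round(dipole_br(0.0)))
```

The full dataset of course contains all harmonic degrees, not just the dipole, which is why ChaosMagPy is used to synthesize the field from the gufm1 coefficients.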
## Dataset Creation
The native model representation is converted into a discrete dataset in physical space and time, using the Python package [Chaosmagpy](https://chaosmagpy.readthedocs.io/en/master/)
### Source Data
## Additional Information
### Dataset Curators
### Licensing Information
MIT Licence
### Citation Information
### Contributions
|