| datasetId | card |
|---|---|
ahmed-ai/skin-lesions-classification-dataset | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Actinic keratoses
'1': Basal cell carcinoma
'2': Benign keratosis-like lesions
'3': Chickenpox
'4': Cowpox
'5': Dermatofibroma
'6': HFMD
'7': Healthy
'8': Measles
'9': Melanocytic nevi
'10': Melanoma
'11': Monkeypox
'12': Squamous cell carcinoma
'13': Vascular lesions
splits:
- name: train
num_bytes: 11781822388.236
num_examples: 29322
- name: validation
num_bytes: 1129580056.38
num_examples: 3660
- name: test
num_bytes: 1166877801.52
num_examples: 3674
download_size: 9960809758
dataset_size: 14078280246.136002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Skin Lesions Dataset
A dataset for classifying 14 types of skin lesions, created by merging [HAM10000(2019)](https://www.kaggle.com/datasets/andrewmvd/isic-2019) and [MSLDv2.0](https://www.kaggle.com/datasets/joydippaul/mpox-skin-lesion-dataset-version-20-msld-v20).
The dataset consists of 14 categories:
- Actinic keratoses
- Basal cell carcinoma
- Benign keratosis-like lesions
- Chickenpox
- Cowpox
- Dermatofibroma
- Healthy
- HFMD
- Measles
- Melanocytic nevi
- Melanoma
- Monkeypox
- Squamous cell carcinoma
- Vascular lesions
## Load the dataset
```python
from datasets import load_dataset
dataset = load_dataset("ahmed-ai/skin-lesions-classification-dataset")
```
## Citations for the original datasets
### MSLDv2.0
```
@article{Nafisa2023,
title={A Web-based Mpox Skin Lesion Detection System Using State-of-the-art Deep Learning Models Considering Racial Diversity},
author={Ali, Shams Nafisa and Ahmed, Md. Tazuddin and Jahan, Tasnim and Paul, Joydip and Sani, S. M. Sakeef and Noor, Nawshaba and Asma, Anzirun Nahar and Hasan, Taufiq},
journal={arXiv preprint arXiv:2306.14169},
year={2023}
}
```
### HAM10000 (2019)
```
BCN_20000 Dataset: (c) Department of Dermatology, Hospital Clínic de Barcelona
HAM10000 Dataset: (c) by ViDIR Group, Department of Dermatology, Medical University of Vienna; https://doi.org/10.1038/sdata.2018.161
MSK Dataset: (c) Anonymous; https://arxiv.org/abs/1710.05006; https://arxiv.org/abs/1902.03368
```
|
ShenaoZhang/0.001_idpo_4iters_ref_response | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
splits:
- name: train_prefs_1
num_bytes: 125659628
num_examples: 15283
- name: test_prefs_1
num_bytes: 16380615
num_examples: 2000
- name: train_prefs_2
num_bytes: 127949761
num_examples: 15283
- name: test_prefs_2
num_bytes: 16634946
num_examples: 2000
- name: train_prefs_3
num_bytes: 128829232
num_examples: 15283
- name: test_prefs_3
num_bytes: 16759815
num_examples: 2000
download_size: 239309329
dataset_size: 432213997
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_2
path: data/test_prefs_2-*
- split: train_prefs_3
path: data/train_prefs_3-*
- split: test_prefs_3
path: data/test_prefs_3-*
---
# Dataset Card for "0.001_idpo_4iters_ref_response"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_tr_conf_mixis | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87101
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_mixis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KatoHF/ultrachat_200k_scored | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: text
dtype: string
- name: score
dtype: float32
splits:
- name: train_sft
num_bytes: 2631213458
num_examples: 207865
- name: test_sft
num_bytes: 291121933
num_examples: 23110
download_size: 1490643246
dataset_size: 2922335391
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
Harshithacj123/new_data_model_methanol | ---
dataset_info:
features:
- name: Train
dtype: string
splits:
- name: train
num_bytes: 3446
num_examples: 5
download_size: 7910
dataset_size: 3446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-zeroshot__twitter-financial-news-topic-zeroshot__twitte-178919-28982144929 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- zeroshot/twitter-financial-news-topic
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: ['bertscore']
dataset_name: zeroshot/twitter-financial-news-topic
dataset_config: zeroshot--twitter-financial-news-topic
dataset_split: train
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: zeroshot/twitter-financial-news-topic
* Config: zeroshot--twitter-financial-news-topic
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@peterdevathala](https://huggingface.co/peterdevathala) for evaluating this model. |
roborovski/phi-2-embeddings | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: label
dtype: int64
- name: cost
dtype: float64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 437613327
num_examples: 50000
download_size: 297833753
dataset_size: 437613327
---
# Dataset Card for "phi-2-embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DFKI-SLT/knowledge_net | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: KnowledgeNet is a dataset for automatically populating a knowledge base
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- knowledgenet
task_categories:
- text-classification
task_ids:
- multi-class-classification
- entity-linking-classification
dataset_info:
- config_name: knet
features:
- name: fold
dtype: int32
- name: documentId
dtype: string
- name: source
dtype: string
- name: documentText
dtype: string
- name: passages
sequence:
- name: passageId
dtype: string
- name: passageStart
dtype: int32
- name: passageEnd
dtype: int32
- name: passageText
dtype: string
- name: exhaustivelyAnnotatedProperties
sequence:
- name: propertyId
dtype: string
- name: propertyName
dtype: string
- name: propertyDescription
dtype: string
- name: facts
sequence:
- name: factId
dtype: string
- name: propertyId
dtype: string
- name: humanReadable
dtype: string
- name: annotatedPassage
dtype: string
- name: subjectStart
dtype: int32
- name: subjectEnd
dtype: int32
- name: subjectText
dtype: string
- name: subjectUri
dtype: string
- name: objectStart
dtype: int32
- name: objectEnd
dtype: int32
- name: objectText
dtype: string
- name: objectUri
dtype: string
splits:
- name: train
num_bytes: 10161415
num_examples: 3977
download_size: 14119313
dataset_size: 10161415
- config_name: knet_tokenized
features:
- name: doc_id
dtype: string
- name: passage_id
dtype: string
- name: fact_id
dtype: string
- name: tokens
sequence: string
- name: subj_start
dtype: int32
- name: subj_end
dtype: int32
- name: subj_type
dtype:
class_label:
names:
'0': O
'1': PER
'2': ORG
'3': LOC
'4': DATE
- name: subj_uri
dtype: string
- name: obj_start
dtype: int32
- name: obj_end
dtype: int32
- name: obj_type
dtype:
class_label:
names:
'0': O
'1': PER
'2': ORG
'3': LOC
'4': DATE
- name: obj_uri
dtype: string
- name: relation
dtype:
class_label:
names:
'0': NO_RELATION
'1': DATE_OF_BIRTH
'2': DATE_OF_DEATH
'3': PLACE_OF_RESIDENCE
'4': PLACE_OF_BIRTH
'5': NATIONALITY
'6': EMPLOYEE_OR_MEMBER_OF
'7': EDUCATED_AT
'8': POLITICAL_AFFILIATION
'9': CHILD_OF
'10': SPOUSE
'11': DATE_FOUNDED
'12': HEADQUARTERS
'13': SUBSIDIARY_OF
'14': FOUNDED_BY
'15': CEO
splits:
- name: train
num_bytes: 4511963
num_examples: 10895
download_size: 14119313
dataset_size: 4511963
- config_name: knet_re
features:
- name: documentId
dtype: string
- name: passageId
dtype: string
- name: factId
dtype: string
- name: passageText
dtype: string
- name: humanReadable
dtype: string
- name: annotatedPassage
dtype: string
- name: subjectStart
dtype: int32
- name: subjectEnd
dtype: int32
- name: subjectText
dtype: string
- name: subjectType
dtype:
class_label:
names:
'0': O
'1': PER
'2': ORG
'3': LOC
'4': DATE
- name: subjectUri
dtype: string
- name: objectStart
dtype: int32
- name: objectEnd
dtype: int32
- name: objectText
dtype: string
- name: objectType
dtype:
class_label:
names:
'0': O
'1': PER
'2': ORG
'3': LOC
'4': DATE
- name: objectUri
dtype: string
- name: relation
dtype:
class_label:
names:
'0': NO_RELATION
'1': DATE_OF_BIRTH
'2': DATE_OF_DEATH
'3': PLACE_OF_RESIDENCE
'4': PLACE_OF_BIRTH
'5': NATIONALITY
'6': EMPLOYEE_OR_MEMBER_OF
'7': EDUCATED_AT
'8': POLITICAL_AFFILIATION
'9': CHILD_OF
'10': SPOUSE
'11': DATE_FOUNDED
'12': HEADQUARTERS
'13': SUBSIDIARY_OF
'14': FOUNDED_BY
'15': CEO
splits:
- name: train
num_bytes: 6098219
num_examples: 10895
download_size: 14119313
dataset_size: 6098219
---
# Dataset Card for "KnowledgeNet"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [knowledge-net](https://github.com/diffbot/knowledge-net)
- **Paper:** [KnowledgeNet: A Benchmark Dataset for Knowledge Base Population](https://aclanthology.org/D19-1069/)
- **Size of downloaded dataset files:** 12.59 MB
- **Size of the generated dataset:** 6.1 MB
### Dataset Summary
KnowledgeNet is a benchmark dataset for the task of automatically populating a knowledge base (Wikidata) with facts
expressed in natural language text on the web. KnowledgeNet provides text exhaustively annotated with facts, thus
enabling the holistic end-to-end evaluation of knowledge base population systems as a whole, unlike previous benchmarks
that are more suitable for the evaluation of individual subcomponents (e.g., entity linking, relation extraction).
For instance, the dataset contains text expressing the fact (Gennaro Basile; RESIDENCE; Moravia), in the passage:
"Gennaro Basile was an Italian painter, born in Naples but active in the German-speaking countries. He settled at Brünn,
in Moravia, and lived about 1756..."
For a description of the dataset and baseline systems, please refer to their
[EMNLP paper](https://github.com/diffbot/knowledge-net/blob/master/knowledgenet-emnlp-cameraready.pdf).
Note: This dataset reader currently only supports the `train` split and does not contain negative examples.
In addition to the original format, this repository also provides two versions (`knet_re`, `knet_tokenized`) that are
easier to use for simple relation extraction. You can load them with
`datasets.load_dataset("DFKI-SLT/knowledge_net", name="<config>")`.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
The language in the dataset is English.
## Dataset Structure
### Data Instances
#### knet
- **Size of downloaded dataset files:** 12.59 MB
- **Size of the generated dataset:** 10.16 MB
An example of 'train' looks as follows:
```json
{
"fold": 2,
"documentId": "8313",
"source": "DBpedia Abstract",
"documentText": "Gennaro Basile\n\nGennaro Basile was an Italian painter, born in Naples but active in the German-speaking countries. He settled at Brünn, in Moravia, and lived about 1756. His best picture is the altar-piece in the chapel of the chateau at Seeberg, in Salzburg. Most of his works remained in Moravia.",
"passages": [
{
"passageId": "8313:16:114",
"passageStart": 16,
"passageEnd": 114,
"passageText": "Gennaro Basile was an Italian painter, born in Naples but active in the German-speaking countries.",
"exhaustivelyAnnotatedProperties": [
{
"propertyId": "12",
"propertyName": "PLACE_OF_BIRTH",
"propertyDescription": "Describes the relationship between a person and the location where she/he was born."
}
],
"facts": [
{
"factId": "8313:16:30:63:69:12",
"propertyId": "12",
"humanReadable": "<Gennaro Basile> <PLACE_OF_BIRTH> <Naples>",
"annotatedPassage": "<Gennaro Basile> was an Italian painter, born in <Naples> but active in the German-speaking countries.",
"subjectStart": 16,
"subjectEnd": 30,
"subjectText": "Gennaro Basile",
"subjectUri": "http://www.wikidata.org/entity/Q19517888",
"objectStart": 63,
"objectEnd": 69,
"objectText": "Naples",
"objectUri": "http://www.wikidata.org/entity/Q2634"
}
]
},
{
"passageId": "8313:115:169",
"passageStart": 115,
"passageEnd": 169,
"passageText": "He settled at Brünn, in Moravia, and lived about 1756.",
"exhaustivelyAnnotatedProperties": [
{
"propertyId": "11",
"propertyName": "PLACE_OF_RESIDENCE",
"propertyDescription": "Describes the relationship between a person and the location where she/he lives/lived."
},
{
"propertyId": "12",
"propertyName": "PLACE_OF_BIRTH",
"propertyDescription": "Describes the relationship between a person and the location where she/he was born."
}
],
"facts": [
{
"factId": "8313:115:117:129:134:11",
"propertyId": "11",
"humanReadable": "<He> <PLACE_OF_RESIDENCE> <Brünn>",
"annotatedPassage": "<He> settled at <Brünn>, in Moravia, and lived about 1756.",
"subjectStart": 115,
"subjectEnd": 117,
"subjectText": "He",
"subjectUri": "http://www.wikidata.org/entity/Q19517888",
"objectStart": 129,
"objectEnd": 134,
"objectText": "Brünn",
"objectUri": "http://www.wikidata.org/entity/Q14960"
},
{
"factId": "8313:115:117:139:146:11",
"propertyId": "11",
"humanReadable": "<He> <PLACE_OF_RESIDENCE> <Moravia>",
"annotatedPassage": "<He> settled at Brünn, in <Moravia>, and lived about 1756.",
"subjectStart": 115,
"subjectEnd": 117,
"subjectText": "He",
"subjectUri": "http://www.wikidata.org/entity/Q19517888",
"objectStart": 139,
"objectEnd": 146,
"objectText": "Moravia",
"objectUri": "http://www.wikidata.org/entity/Q43266"
}
]
}
]
}
```
#### knet_re
- **Size of downloaded dataset files:** 12.59 MB
- **Size of the generated dataset:** 6.1 MB
An example of 'train' looks as follows:
```json
{
"documentId": "7",
"passageId": "7:23:206",
"factId": "7:23:44:138:160:1",
"passageText": "Tata Chemicals Europe (formerly Brunner Mond (UK) Limited) is a UK-based chemicals company that is a subsidiary of Tata Chemicals Limited, itself a part of the India-based Tata Group.",
"humanReadable": "<Tata Chemicals Europe> <SUBSIDIARY_OF> <Tata Chemicals Limited>",
"annotatedPassage": "<Tata Chemicals Europe> (formerly Brunner Mond (UK) Limited) is a UK-based chemicals company that is a subsidiary of <Tata Chemicals Limited>, itself a part of the India-based Tata Group.",
"subjectStart": 0,
"subjectEnd": 21,
"subjectText": "Tata Chemicals Europe",
"subjectType": 2,
"subjectUri": "",
"objectStart": 115,
"objectEnd": 137,
"objectText": "Tata Chemicals Limited",
"objectType": 2,
"objectUri": "http://www.wikidata.org/entity/Q2331365",
"relation": 13
}
```
#### knet_tokenized
- **Size of downloaded dataset files:** 12.59 MB
- **Size of the generated dataset:** 4.5 MB
An example of 'train' looks as follows:
```json
{
"doc_id": "7",
"passage_id": "7:23:206",
"fact_id": "7:162:168:183:205:1",
"tokens": ["Tata", "Chemicals", "Europe", "(", "formerly", "Brunner", "Mond", "(", "UK", ")", "Limited", ")", "is", "a", "UK", "-", "based", "chemicals", "company", "that", "is", "a", "subsidiary", "of", "Tata", "Chemicals", "Limited", ",", "itself", "a", "part", "of", "the", "India", "-", "based", "Tata", "Group", "."],
"subj_start": 28,
"subj_end": 29,
"subj_type": 2,
"subj_uri": "http://www.wikidata.org/entity/Q2331365",
"obj_start": 33,
"obj_end": 38,
"obj_type": 2,
"obj_uri": "http://www.wikidata.org/entity/Q331715",
"relation": 13
}
```
### Data Fields
#### knet
- `fold`: the fold, an `int` feature.
- `documentId`: the document id, a `string` feature.
- `source`: the source, a `string` feature.
- `documentText`: the document text, a `string` feature.
- `passages`: the list of passages, a `list` of `dict`.
- `passageId`: the passage id, a `string` feature.
- `passageStart`: the passage start, an `int` feature.
- `passageEnd`: the passage end, an `int` feature.
- `passageText`: the passage text, a `string` feature.
- `exhaustivelyAnnotatedProperties`: the list of exhaustively annotated properties, a `list` of `dict`.
- `propertyId`: the property id, a `string` feature.
- `propertyName`: the property name, a `string` feature.
- `propertyDescription`: the property description, a `string` feature.
- `facts`: the list of facts, a `list` of `dict`.
- `factId`: the fact id, a `string` feature.
- `propertyId`: the property id, a `string` feature.
- `humanReadable`: the human readable annotation, a `string` feature.
- `annotatedPassage`: the annotated passage, a `string` feature.
- `subjectStart`: the subject start, an `int` feature.
- `subjectEnd`: the subject end, an `int` feature.
- `subjectText`: the subject text, a `string` feature.
- `subjectUri`: the subject uri, a `string` feature.
- `objectStart`: the object start, an `int` feature.
- `objectEnd`: the object end, an `int` feature.
- `objectText`: the object text, a `string` feature.
- `objectUri`: the object uri, a `string` feature.
#### knet_re
- `documentId`: the document id, a `string` feature.
- `passageId`: the passage id, a `string` feature.
- `passageText`: the passage text, a `string` feature.
- `factId`: the fact id, a `string` feature.
- `humanReadable`: the human-readable annotation, a `string` feature.
- `annotatedPassage`: the annotated passage, a `string` feature.
- `subjectStart`: the index of the start character of the relation subject mention, an `int` feature.
- `subjectEnd`: the index of the end character of the relation subject mention, exclusive, an `int` feature.
- `subjectText`: the text of the subject mention, a `string` feature.
- `subjectType`: the NER type of the subject mention, a classification label.
```json
{"O": 0, "PER": 1, "ORG": 2, "LOC": 3, "DATE": 4}
```
- `subjectUri`: the Wikidata URI of the subject mention, a `string` feature.
- `objectStart`: the index of the start character of the relation object mention, an `int` feature.
- `objectEnd`: the index of the end character of the relation object mention, exclusive, an `int` feature.
- `objectText`: the text of the object mention, a `string` feature.
- `objectType`: the NER type of the object mention, a classification label.
```json
{"O": 0, "PER": 1, "ORG": 2, "LOC": 3, "DATE": 4}
```
- `objectUri`: the Wikidata URI of the object mention, a `string` feature.
- `relation`: the relation label of this instance, a classification label.
```json
{"NO_RELATION": 0, "DATE_OF_BIRTH": 1, "DATE_OF_DEATH": 2, "PLACE_OF_RESIDENCE": 3, "PLACE_OF_BIRTH": 4, "NATIONALITY": 5, "EMPLOYEE_OR_MEMBER_OF": 6, "EDUCATED_AT": 7, "POLITICAL_AFFILIATION": 8, "CHILD_OF": 9, "SPOUSE": 10, "DATE_FOUNDED": 11, "HEADQUARTERS": 12, "SUBSIDIARY_OF": 13, "FOUNDED_BY": 14, "CEO": 15}
```
#### knet_tokenized
- `doc_id`: the document id, a `string` feature.
- `passage_id`: the passage id, a `string` feature.
- `fact_id`: the fact id, a `string` feature.
- `tokens`: the list of tokens of this passage, obtained with spaCy, a `list` of `string` features.
- `subj_start`: the index of the start token of the relation subject mention, an `int` feature.
- `subj_end`: the index of the end token of the relation subject mention, exclusive, an `int` feature.
- `subj_type`: the NER type of the subject mention, a classification label.
```json
{"O": 0, "PER": 1, "ORG": 2, "LOC": 3, "DATE": 4}
```
- `subj_uri`: the Wikidata URI of the subject mention, a `string` feature.
- `obj_start`: the index of the start token of the relation object mention, an `int` feature.
- `obj_end`: the index of the end token of the relation object mention, exclusive, an `int` feature.
- `obj_type`: the NER type of the object mention, a classification label.
```json
{"O": 0, "PER": 1, "ORG": 2, "LOC": 3, "DATE": 4}
```
- `obj_uri`: the Wikidata URI of the object mention, a `string` feature.
- `relation`: the relation label of this instance, a classification label.
```json
{"NO_RELATION": 0, "DATE_OF_BIRTH": 1, "DATE_OF_DEATH": 2, "PLACE_OF_RESIDENCE": 3, "PLACE_OF_BIRTH": 4, "NATIONALITY": 5, "EMPLOYEE_OR_MEMBER_OF": 6, "EDUCATED_AT": 7, "POLITICAL_AFFILIATION": 8, "CHILD_OF": 9, "SPOUSE": 10, "DATE_FOUNDED": 11, "HEADQUARTERS": 12, "SUBSIDIARY_OF": 13, "FOUNDED_BY": 14, "CEO": 15}
```
### Data Splits
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Instances without an annotated relation are labeled as `NO_RELATION`.
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{mesquita-etal-2019-knowledgenet,
title = "{K}nowledge{N}et: A Benchmark Dataset for Knowledge Base Population",
author = "Mesquita, Filipe and
Cannaviccio, Matteo and
Schmidek, Jordan and
Mirza, Paramita and
Barbosa, Denilson",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D19-1069",
doi = "10.18653/v1/D19-1069",
pages = "749--758",}
```
### Contributions
Thanks to [@phucdev](https://github.com/phucdev) for adding this dataset. |
autoevaluate/autoeval-staging-eval-project-ac4402f5-7985079 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- beans
eval_info:
task: image_multi_class_classification
model: nickmuchi/vit-base-beans
metrics: []
dataset_name: beans
dataset_config: default
dataset_split: test
col_mapping:
image: image
target: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Image Classification
* Model: nickmuchi/vit-base-beans
* Dataset: beans
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
bipulai/skillate_helpdesk | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: cleaned_question
dtype: string
- name: cleaned_answer
dtype: string
splits:
- name: train
num_bytes: 617367
num_examples: 302
download_size: 236993
dataset_size: 617367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "skillate_helpdesk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_167 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23174365392.125
num_examples: 241279
download_size: 20849659760
dataset_size: 23174365392.125
---
# Dataset Card for "chunk_167"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BobdoRock/AngelinaJolie | ---
license: openrail
---
|
bigscience-data/roots_eu_bsbasque | ---
language: eu
license: cc-by-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_eu_bsbasque
# bsbasque
- Dataset uid: `bsbasque`
### Description
BSBasque dataset. The text is extracted from the following domains:
https://www.berria.eus
https://eu.wikipedia.org
https://goiena.eus
https://www.argia.eus
https://goierri.hitza.eus
### Homepage
### Licensing
CC BY-SA 4.0
### Speaker Locations
### Sizes
- 0.0877 % of total
- 53.9848 % of eu
### BigScience processing steps
#### Filters applied to: eu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
|
open-llm-leaderboard/details_leveldevai__BeagleMist-7B | ---
pretty_name: Evaluation run of leveldevai/BeagleMist-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [leveldevai/BeagleMist-7B](https://huggingface.co/leveldevai/BeagleMist-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_leveldevai__BeagleMist-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T19:26:57.593325](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__BeagleMist-7B/blob/main/results_2024-01-19T19-26-57.593325.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6578372082846259,\n\
\ \"acc_stderr\": 0.03183820575158367,\n \"acc_norm\": 0.6576272237213145,\n\
\ \"acc_norm_stderr\": 0.03249584099098042,\n \"mc1\": 0.48225214198286415,\n\
\ \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.648345615677013,\n\
\ \"mc2_stderr\": 0.0152064953463137\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6791808873720137,\n \"acc_stderr\": 0.013640943091946528,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6963752240589524,\n\
\ \"acc_stderr\": 0.004588827958775114,\n \"acc_norm\": 0.874726150169289,\n\
\ \"acc_norm_stderr\": 0.0033035264131234957\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033048,\n \"\
acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.01273492357953207,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.01273492357953207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n\
\ \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.648345615677013,\n\
\ \"mc2_stderr\": 0.0152064953463137\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613988\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7187263078089462,\n \
\ \"acc_stderr\": 0.012384789310940243\n }\n}\n```"
repo_url: https://huggingface.co/leveldevai/BeagleMist-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|arc:challenge|25_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|gsm8k|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hellaswag|10_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T19-26-57.593325.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- '**/details_harness|winogrande|5_2024-01-19T19-26-57.593325.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T19-26-57.593325.parquet'
- config_name: results
data_files:
- split: 2024_01_19T19_26_57.593325
path:
- results_2024-01-19T19-26-57.593325.parquet
- split: latest
path:
- results_2024-01-19T19-26-57.593325.parquet
---
# Dataset Card for Evaluation run of leveldevai/BeagleMist-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [leveldevai/BeagleMist-7B](https://huggingface.co/leveldevai/BeagleMist-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_leveldevai__BeagleMist-7B",
"harness_winogrande_5",
split="train")
```
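Split names encode the run timestamp, so when a repository accumulates several runs you can recover the most recent one by parsing the names rather than relying on the "latest" alias. A minimal sketch in pure Python (the split names below are hypothetical examples modelled on the `%Y_%m_%dT%H_%M_%S.%f` pattern used in this card):

```python
from datetime import datetime

# Hypothetical split names following the timestamp pattern used by this
# repo (e.g. "2024_01_19T19_26_57.593325").
splits = ["2023_12_01T08_00_00.000000", "2024_01_19T19_26_57.593325"]

def parse_ts(name: str) -> datetime:
    """Parse a timestamped split name into a datetime for comparison."""
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

# Pick the chronologically latest run.
latest = max(splits, key=parse_ts)
print(latest)  # -> 2024_01_19T19_26_57.593325
```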
## Latest results
These are the [latest results from run 2024-01-19T19:26:57.593325](https://huggingface.co/datasets/open-llm-leaderboard/details_leveldevai__BeagleMist-7B/blob/main/results_2024-01-19T19-26-57.593325.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6578372082846259,
"acc_stderr": 0.03183820575158367,
"acc_norm": 0.6576272237213145,
"acc_norm_stderr": 0.03249584099098042,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.01749247084307536,
"mc2": 0.648345615677013,
"mc2_stderr": 0.0152064953463137
},
"harness|arc:challenge|25": {
"acc": 0.6791808873720137,
"acc_stderr": 0.013640943091946528,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.6963752240589524,
"acc_stderr": 0.004588827958775114,
"acc_norm": 0.874726150169289,
"acc_norm_stderr": 0.0033035264131234957
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.03496101481191179,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.03496101481191179
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8605504587155963,
"acc_stderr": 0.014852421490033048,
"acc_norm": 0.8605504587155963,
"acc_norm_stderr": 0.014852421490033048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694486,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.01273492357953207,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.01273492357953207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.02826388994378459,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.02826388994378459
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.01749247084307536,
"mc2": 0.648345615677013,
"mc2_stderr": 0.0152064953463137
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613988
},
"harness|gsm8k|5": {
"acc": 0.7187263078089462,
"acc_stderr": 0.012384789310940243
}
}
```
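The per-task `hendrycksTest` entries above can be aggregated into a single MMLU-style score by averaging across subjects. A hedged sketch (the `results` dict reproduces only three of the entries above for illustration, and the plain macro-average is an assumption for clarity, not the leaderboard's exact aggregation code):

```python
# Hypothetical sketch: macro-averaging per-subject accuracies from the
# "Latest results" dict above. Only three entries are reproduced here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6592592592592592},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6973684210526315},
}

# Keep only MMLU (hendrycksTest) subjects and average their acc_norm.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))  # -> 0.5655
```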
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Neel-Gupta/minipile-processed_3072 | ---
dataset_info:
features:
- name: text
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 41564177600
num_examples: 1100
- name: test
num_bytes: 377856160
num_examples: 10
download_size: 4080693648
dataset_size: 41942033760
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-67ab09-31609144970 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: ARTeLab/it5-summarization-ilpost
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ARTeLab/it5-summarization-ilpost
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@malar](https://huggingface.co/malar) for evaluating this model. |
open-llm-leaderboard/details_jeiku__Eros_Prodigadigm_7B | ---
pretty_name: Evaluation run of jeiku/Eros_Prodigadigm_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeiku/Eros_Prodigadigm_7B](https://huggingface.co/jeiku/Eros_Prodigadigm_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeiku__Eros_Prodigadigm_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-23T07:32:28.592542](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__Eros_Prodigadigm_7B/blob/main/results_2024-03-23T07-32-28.592542.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6327621655785706,\n\
\ \"acc_stderr\": 0.0326777929571602,\n \"acc_norm\": 0.6349676787099677,\n\
\ \"acc_norm_stderr\": 0.03334076539604183,\n \"mc1\": 0.5079559363525091,\n\
\ \"mc1_stderr\": 0.01750128507455182,\n \"mc2\": 0.6868028587570413,\n\
\ \"mc2_stderr\": 0.014924539614934326\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.014012883334859859,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6804421429994025,\n\
\ \"acc_stderr\": 0.0046535230383693725,\n \"acc_norm\": 0.8563035251941844,\n\
\ \"acc_norm_stderr\": 0.00350064796787958\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062948,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062948\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031093,\n \"\
acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031093\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.03031371053819889,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.03031371053819889\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206858,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206858\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203624,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4681564245810056,\n\
\ \"acc_stderr\": 0.016688553415612213,\n \"acc_norm\": 0.4681564245810056,\n\
\ \"acc_norm_stderr\": 0.016688553415612213\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.02638527370346448,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.02638527370346448\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890155,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890155\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529682,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529682\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417482,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417482\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5079559363525091,\n\
\ \"mc1_stderr\": 0.01750128507455182,\n \"mc2\": 0.6868028587570413,\n\
\ \"mc2_stderr\": 0.014924539614934326\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510429\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5329795299469295,\n \
\ \"acc_stderr\": 0.013742492794163423\n }\n}\n```"
repo_url: https://huggingface.co/jeiku/Eros_Prodigadigm_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|arc:challenge|25_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|gsm8k|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hellaswag|10_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T07-32-28.592542.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T07-32-28.592542.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- '**/details_harness|winogrande|5_2024-03-23T07-32-28.592542.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-23T07-32-28.592542.parquet'
- config_name: results
data_files:
- split: 2024_03_23T07_32_28.592542
path:
- results_2024-03-23T07-32-28.592542.parquet
- split: latest
path:
- results_2024-03-23T07-32-28.592542.parquet
---
# Dataset Card for Evaluation run of jeiku/Eros_Prodigadigm_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeiku/Eros_Prodigadigm_7B](https://huggingface.co/jeiku/Eros_Prodigadigm_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeiku__Eros_Prodigadigm_7B",
"harness_winogrande_5",
split="train")
```
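The config names in the YAML header above follow a simple pattern: `harness_` plus a sanitized version of the harness task name, plus the few-shot count. As an illustrative sketch (this helper is not part of any official API), the mapping can be reproduced like this:

```python
def config_name(task: str, n_shot: int) -> str:
    """Rebuild a leaderboard config name from a harness task name.

    Mirrors the pattern visible in the YAML header above: "-" and ":"
    in the harness task name become "_" in the config name.
    """
    sanitized = task.replace("-", "_").replace(":", "_")
    return f"harness_{sanitized}_{n_shot}"

print(config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Any of the resulting names can be passed as the second argument to `load_dataset`, as in the example above.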
## Latest results
These are the [latest results from run 2024-03-23T07:32:28.592542](https://huggingface.co/datasets/open-llm-leaderboard/details_jeiku__Eros_Prodigadigm_7B/blob/main/results_2024-03-23T07-32-28.592542.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6327621655785706,
"acc_stderr": 0.0326777929571602,
"acc_norm": 0.6349676787099677,
"acc_norm_stderr": 0.03334076539604183,
"mc1": 0.5079559363525091,
"mc1_stderr": 0.01750128507455182,
"mc2": 0.6868028587570413,
"mc2_stderr": 0.014924539614934326
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.014012883334859859,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6804421429994025,
"acc_stderr": 0.0046535230383693725,
"acc_norm": 0.8563035251941844,
"acc_norm_stderr": 0.00350064796787958
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062948,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062948
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031093,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031093
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206858,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206858
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203624,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4681564245810056,
"acc_stderr": 0.016688553415612213,
"acc_norm": 0.4681564245810056,
"acc_norm_stderr": 0.016688553415612213
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346448,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346448
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529682,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529682
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417482,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417482
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5079559363525091,
"mc1_stderr": 0.01750128507455182,
"mc2": 0.6868028587570413,
"mc2_stderr": 0.014924539614934326
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510429
},
"harness|gsm8k|5": {
"acc": 0.5329795299469295,
"acc_stderr": 0.013742492794163423
}
}
```
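The per-task entries above share a key layout (`harness|<task>|<n_shot>` mapping to a metrics dict), so aggregates such as the MMLU average can be computed by filtering on the `harness|hendrycksTest-` prefix. A minimal sketch, using only a few of the values shown above for brevity (the snippet is illustrative, not part of the evaluation harness):

```python
# Illustrative sketch: average MMLU ("hendrycksTest") accuracies from a
# results dict shaped like the JSON above. Only three subtasks are shown.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
    "harness|truthfulqa:mc|0": {"mc1": 0.5079559363525091},  # not MMLU, skipped
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # 0.5571 for these three subtasks
```

The same filtering works unchanged on the full results dict, which contains all 57 MMLU subtasks.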
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of Sao10K/Ana-v1-m7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Ana-v1-m7](https://huggingface.co/Sao10K/Ana-v1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Ana-v1-m7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T11:00:42.385261](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Ana-v1-m7/blob/main/results_2023-12-13T11-00-42.385261.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6450876976103553,\n\
\ \"acc_stderr\": 0.032232788125298194,\n \"acc_norm\": 0.6484944592637535,\n\
\ \"acc_norm_stderr\": 0.032869268033759086,\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.5503162010975955,\n\
\ \"mc2_stderr\": 0.01574450133768535\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839152,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6762597092212707,\n\
\ \"acc_stderr\": 0.0046694598919176915,\n \"acc_norm\": 0.859788886675961,\n\
\ \"acc_norm_stderr\": 0.0034649633793799287\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"\
acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291286,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.5503162010975955,\n\
\ \"mc2_stderr\": 0.01574450133768535\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.011631268360607778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5253980288097043,\n \
\ \"acc_stderr\": 0.01375470508911231\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Ana-v1-m7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|arc:challenge|25_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|gsm8k|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hellaswag|10_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T11-00-42.385261.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- '**/details_harness|winogrande|5_2023-12-13T11-00-42.385261.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T11-00-42.385261.parquet'
- config_name: results
data_files:
- split: 2023_12_13T11_00_42.385261
path:
- results_2023-12-13T11-00-42.385261.parquet
- split: latest
path:
- results_2023-12-13T11-00-42.385261.parquet
---
# Dataset Card for Evaluation run of Sao10K/Ana-v1-m7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Ana-v1-m7](https://huggingface.co/Sao10K/Ana-v1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Ana-v1-m7",
"harness_winogrande_5",
split="train")
```
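The split names are simply the run timestamps with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A small helper illustrating the mapping (this is just an illustration of the naming convention, not part of the `datasets` API):

```python
def timestamp_to_split(ts: str) -> str:
    # A run timestamp like "2023-12-13T11:00:42.385261" appears as the
    # split name "2023_12_13T11_00_42.385261": dashes and colons become
    # underscores, while the "." before the microseconds is preserved.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-13T11:00:42.385261"))
# → 2023_12_13T11_00_42.385261
```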
## Latest results
These are the [latest results from run 2023-12-13T11:00:42.385261](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Ana-v1-m7/blob/main/results_2023-12-13T11-00-42.385261.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6450876976103553,
"acc_stderr": 0.032232788125298194,
"acc_norm": 0.6484944592637535,
"acc_norm_stderr": 0.032869268033759086,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.5503162010975955,
"mc2_stderr": 0.01574450133768535
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839152,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.6762597092212707,
"acc_stderr": 0.0046694598919176915,
"acc_norm": 0.859788886675961,
"acc_norm_stderr": 0.0034649633793799287
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291286,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.5503162010975955,
"mc2_stderr": 0.01574450133768535
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.011631268360607778
},
"harness|gsm8k|5": {
"acc": 0.5253980288097043,
"acc_stderr": 0.01375470508911231
}
}
```
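A payload like the one above is a plain dict keyed by task name, so the per-task accuracies can be pulled out and averaged directly. The sketch below (a rough sanity check only; the leaderboard's own "all" aggregate may weight or select tasks differently) uses a small hypothetical sample rather than the full results:

```python
def mean_acc(results: dict) -> float:
    # Average the "acc" field over every per-task entry, skipping the
    # precomputed "all" aggregate and tasks that report other metrics
    # (e.g. truthfulqa reports mc1/mc2 instead of acc).
    accs = [
        v["acc"]
        for k, v in results.items()
        if k != "all" and "acc" in v
    ]
    return sum(accs) / len(accs)

# Hypothetical miniature payload in the same shape as the JSON above.
sample = {
    "all": {"acc": 0.5},
    "harness|winogrande|5": {"acc": 0.78},
    "harness|gsm8k|5": {"acc": 0.52},
}
print(mean_acc(sample))  # → 0.65
```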
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Zubairjamu/mydata | ---
license: cc-by-3.0
---
|
AdapterOcean/med_alpaca_standardized_cluster_51_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 29382535
num_examples: 16000
download_size: 14413180
dataset_size: 29382535
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_51_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T21:40:15.944875](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-25T21-40-15.944875.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07697147651006711,\n\
\ \"em_stderr\": 0.002729682408788614,\n \"f1\": 0.12191170302013389,\n\
\ \"f1_stderr\": 0.0028589398116221384,\n \"acc\": 0.44546584943629414,\n\
\ \"acc_stderr\": 0.01035635936441261\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.07697147651006711,\n \"em_stderr\": 0.002729682408788614,\n\
\ \"f1\": 0.12191170302013389,\n \"f1_stderr\": 0.0028589398116221384\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \
\ \"acc_stderr\": 0.00891970291116163\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T21_40_15.944875
path:
- '**/details_harness|drop|3_2023-10-25T21-40-15.944875.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T21-40-15.944875.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T21_40_15.944875
path:
- '**/details_harness|gsm8k|5_2023-10-25T21-40-15.944875.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T21-40-15.944875.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-20-42.158103.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T10-20-42.158103.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T21_40_15.944875
path:
- '**/details_harness|winogrande|5_2023-10-25T21-40-15.944875.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T21-40-15.944875.parquet'
- config_name: results
data_files:
- split: 2023_10_10T10_20_42.158103
path:
- results_2023-10-10T10-20-42.158103.parquet
- split: 2023_10_25T21_40_15.944875
path:
- results_2023-10-25T21-40-15.944875.parquet
- split: latest
path:
- results_2023-10-25T21-40-15.944875.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down",
"harness_winogrande_5",
	split="latest")
```
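Judging from the config listing above, each split name appears to be the run timestamp with `-` and `:` replaced by `_` (with `latest` aliasing the most recent run). A small sketch of that naming convention, inferred from the split names in this card (pure string manipulation, no download required):

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as used in result filenames) to its split name.

    Split names cannot contain "-" or ":", so both are replaced by "_";
    the fractional seconds are kept as-is.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-25T21:40:15.944875"))
# → 2023_10_25T21_40_15.944875
```

To load a specific run instead of the latest one, pass the timestamped split name (e.g. `split="2023_10_25T21_40_15.944875"`) to `load_dataset`.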
## Latest results
These are the [latest results from run 2023-10-25T21:40:15.944875](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE5_4w-r8-q_k_v_o_gate_up_down/blob/main/results_2023-10-25T21-40-15.944875.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.07697147651006711,
"em_stderr": 0.002729682408788614,
"f1": 0.12191170302013389,
"f1_stderr": 0.0028589398116221384,
"acc": 0.44546584943629414,
"acc_stderr": 0.01035635936441261
},
"harness|drop|3": {
"em": 0.07697147651006711,
"em_stderr": 0.002729682408788614,
"f1": 0.12191170302013389,
"f1_stderr": 0.0028589398116221384
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.00891970291116163
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xiaohan2023/tulu-v2-sft-mix | ---
license: mit
---
|
HoangHa/processed_lima | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2850866
num_examples: 1030
download_size: 1679958
dataset_size: 2850866
---
# Dataset Card for "processed_lima"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheTung/squad_es_v2 | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- es
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|squad
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: squad-es
pretty_name: SQuAD-es
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
config_name: v1.1.0
splits:
- name: train
num_bytes: 83680438
num_examples: 87595
- name: validation
num_bytes: 10955800
num_examples: 10570
download_size: 39291362
dataset_size: 94636238
---
# Dataset Card for "squad_es"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/ccasimiro88/TranslateAlignRetrieve](https://github.com/ccasimiro88/TranslateAlignRetrieve)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 39.29 MB
- **Size of the generated dataset:** 94.63 MB
- **Total amount of disk used:** 133.92 MB
### Dataset Summary
Automatic translation of the Stanford Question Answering Dataset (SQuAD) into Spanish
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### v1.1.0
- **Size of downloaded dataset files:** 39.29 MB
- **Size of the generated dataset:** 94.63 MB
- **Total amount of disk used:** 133.92 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [404, 356, 356],
"text": ["Santa Clara, California", "Levi 's Stadium", "Levi 's Stadium en la Bahía de San Francisco en Santa Clara, California."]
},
"context": "\"El Super Bowl 50 fue un partido de fútbol americano para determinar al campeón de la NFL para la temporada 2015. El campeón de ...",
"id": "56be4db0acb8001400a502ee",
"question": "¿Dónde tuvo lugar el Super Bowl 50?",
"title": "Super Bowl _ 50"
}
```
### Data Fields
The data fields are the same among all splits.
#### v1.1.0
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
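The `answer_start` values are character offsets into `context` marking where each answer span begins. A minimal sketch of validating those offsets, using a hypothetical record (not taken from the dataset) that mirrors the field layout above:

```python
def answer_spans(example):
    """Return (text, start) pairs, checking that each answer_start
    offset actually points at the answer text inside the context."""
    spans = []
    for text, start in zip(example["answers"]["text"],
                           example["answers"]["answer_start"]):
        # The answer string must appear verbatim at its stated offset.
        assert example["context"][start:start + len(text)] == text
        spans.append((text, start))
    return spans

# Hypothetical record with the same fields as a real example
example = {
    "id": "0001",
    "title": "Super Bowl 50",
    "context": "El Super Bowl 50 fue un partido de fútbol americano.",
    "question": "¿Qué tipo de partido fue el Super Bowl 50?",
    "answers": {"text": ["fútbol americano"], "answer_start": [35]},
}

print(answer_spans(example))  # [('fútbol americano', 35)]
```

The same check is a quick way to spot misaligned offsets introduced by the automatic translation.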
### Data Splits
| name |train|validation|
|------|----:|---------:|
|v1.1.0|87595| 10570|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The SQuAD-es dataset is licensed under the [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
### Citation Information
```
@article{2016arXiv160605250R,
        author = {Carrino, Casimiro Pio and Costa-jussa, Marta R. and Fonollosa, Jose A. R.},
title = "{Automatic Spanish Translation of the SQuAD Dataset for Multilingual
Question Answering}",
journal = {arXiv e-prints},
year = 2019,
eid = {arXiv:1912.05200v1},
pages = {arXiv:1912.05200v1},
archivePrefix = {arXiv},
eprint = {1912.05200v2},
}
```
license: mit
task_categories:
- question-answering
language:
- es
size_categories:
- 10K<n<100K
--- |
galman33/gal_yair_83000_256x256_fixed | ---
dataset_info:
features:
- name: lat
dtype: float64
- name: lon
dtype: float64
- name: country_code
dtype:
class_label:
names:
'0': ad
'1': ae
'2': al
'3': aq
'4': ar
'5': au
'6': bd
'7': be
'8': bg
'9': bm
'10': bo
'11': br
'12': bt
'13': bw
'14': ca
'15': ch
'16': cl
'17': co
'18': cz
'19': de
'20': dk
'21': ec
'22': ee
'23': es
'24': fi
'25': fr
'26': gb
'27': gh
'28': gl
'29': gr
'30': gt
'31': hk
'32': hr
'33': hu
'34': id
'35': ie
'36': il
'37': is
'38': it
'39': ix
'40': jp
'41': kg
'42': kh
'43': kr
'44': la
'45': lk
'46': ls
'47': lt
'48': lu
'49': lv
'50': me
'51': mg
'52': mk
'53': mn
'54': mo
'55': mt
'56': mx
'57': my
'58': nl
'59': 'no'
'60': nz
'61': pe
'62': ph
'63': pl
'64': pt
'65': ro
'66': rs
'67': ru
'68': se
'69': sg
'70': si
'71': sk
'72': sn
'73': sz
'74': th
'75': tn
'76': tr
'77': tw
'78': ua
'79': ug
'80': us
'81': uy
'82': za
- name: image
dtype: image
splits:
- name: train
num_bytes: 8075723633.0
num_examples: 83000
download_size: 8055991198
dataset_size: 8075723633.0
---
# Dataset Card for "gal_yair_83000_256x256_fixed"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_42_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11521964
num_examples: 21231
download_size: 6120586
dataset_size: 11521964
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_42_std"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Maxx0/DatasetProfy | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B | ---
pretty_name: Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T17:59:18.672226](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B/blob/main/results_2023-10-28T17-59-18.672226.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007655201342281879,\n\
\ \"em_stderr\": 0.0008925843316825968,\n \"f1\": 0.06762374161073832,\n\
\ \"f1_stderr\": 0.0015672145775403328,\n \"acc\": 0.48594340621826704,\n\
\ \"acc_stderr\": 0.011102174081480334\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007655201342281879,\n \"em_stderr\": 0.0008925843316825968,\n\
\ \"f1\": 0.06762374161073832,\n \"f1_stderr\": 0.0015672145775403328\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18498862774829417,\n \
\ \"acc_stderr\": 0.010695390472237908\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.01150895769072276\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|arc:challenge|25_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T17_59_18.672226
path:
- '**/details_harness|drop|3_2023-10-28T17-59-18.672226.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T17-59-18.672226.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T17_59_18.672226
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-59-18.672226.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-59-18.672226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hellaswag|10_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T13-03-57.822479.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T13-03-57.822479.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T17_59_18.672226
path:
- '**/details_harness|winogrande|5_2023-10-28T17-59-18.672226.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T17-59-18.672226.parquet'
- config_name: results
data_files:
- split: 2023_10_09T13_03_57.822479
path:
- results_2023-10-09T13-03-57.822479.parquet
- split: 2023_10_28T17_59_18.672226
path:
- results_2023-10-28T17-59-18.672226.parquet
- split: latest
path:
- results_2023-10-28T17-59-18.672226.parquet
---
# Dataset Card for Evaluation run of PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B](https://huggingface.co/PeanutJar/Mistral-v0.1-PeanutButter-v0.0.0-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T17:59:18.672226](https://huggingface.co/datasets/open-llm-leaderboard/details_PeanutJar__Mistral-v0.1-PeanutButter-v0.0.0-7B/blob/main/results_2023-10-28T17-59-18.672226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the `results` configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.007655201342281879,
"em_stderr": 0.0008925843316825968,
"f1": 0.06762374161073832,
"f1_stderr": 0.0015672145775403328,
"acc": 0.48594340621826704,
"acc_stderr": 0.011102174081480334
},
"harness|drop|3": {
"em": 0.007655201342281879,
"em_stderr": 0.0008925843316825968,
"f1": 0.06762374161073832,
"f1_stderr": 0.0015672145775403328
},
"harness|gsm8k|5": {
"acc": 0.18498862774829417,
"acc_stderr": 0.010695390472237908
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.01150895769072276
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
averageandyyy/part_1_imda_10000 | ---
dataset_info:
features:
- name: transcript
dtype: string
- name: path
dtype: string
- name: waveform
sequence: float64
splits:
- name: train
num_bytes: 6702801413.012406
num_examples: 10000
download_size: 1615604216
dataset_size: 6702801413.012406
---
# Dataset Card for "part_1_imda_10000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-66000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 971922
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
manishiitg/teknium-GPTeacher-General-Instruct | ---
dataset_info:
features:
- name: system
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 209818632
num_examples: 178520
download_size: 99430639
dataset_size: 209818632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Miniex/Bulmavoz2 | ---
license: openrail
---
|
AlexanderLJX/Dining-Insights | ---
license: mit
---
|
jahb57/gpt2_embeddings_BATCH_5 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
splits:
- name: train
num_bytes: 18223400336
num_examples: 100000
download_size: 18270602271
dataset_size: 18223400336
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Muennighoff/xstory_cloze | ---
annotations_creators:
- found
language_creators:
- found
language:
- ar
- es
- eu
- hi
- id
- zh
- ru
- my
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_ids: []
tags:
- other-story-completion
---
# Dataset Card for "story_cloze"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
'Story Cloze Test' is a new commonsense reasoning framework for evaluating story understanding,
story generation, and script learning. This test requires a system to choose the correct ending
to a four-sentence story.
### Data Instances
- **Size of downloaded dataset files:** 2.03 MB
- **Size of the generated dataset:** 2.03 MB
- **Total amount of disk used:** 2.05 MB
An example of 'train' looks as follows.
```
{'answer_right_ending': 1,
'input_sentence_1': 'Rick grew up in a troubled household.',
'input_sentence_2': 'He never found good support in family, and turned to gangs.',
'input_sentence_3': "It wasn't long before Rick got shot in a robbery.",
'input_sentence_4': 'The incident caused him to turn a new leaf.',
'sentence_quiz1': 'He is happy now.',
'sentence_quiz2': 'He joined a gang.',
'story_id': '138d5bfb-05cc-41e3-bf2c-fa85ebad14e2'}
```
### Data Fields
The data fields are the same among all splits.
- `input_sentence_1`: The first statement in the story.
- `input_sentence_2`: The second statement in the story.
- `input_sentence_3`: The third statement in the story.
- `input_sentence_4`: The fourth statement in the story.
- `sentence_quiz1`: The first possible continuation of the story.
- `sentence_quiz2`: The second possible continuation of the story.
- `answer_right_ending`: The correct ending; either 1 or 2.
- `story_id`: The story ID.
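As a quick illustration of how these fields fit together, the following sketch reconstructs a full story from the example instance above (`resolve_story` is a name chosen here for illustration, not part of the `datasets` library):

```python
def resolve_story(example: dict) -> str:
    # Concatenate the four context sentences with the ending
    # selected by answer_right_ending (1 or 2).
    context = " ".join(example[f"input_sentence_{i}"] for i in range(1, 5))
    ending = example[f"sentence_quiz{example['answer_right_ending']}"]
    return f"{context} {ending}"

example = {
    "answer_right_ending": 1,
    "input_sentence_1": "Rick grew up in a troubled household.",
    "input_sentence_2": "He never found good support in family, and turned to gangs.",
    "input_sentence_3": "It wasn't long before Rick got shot in a robbery.",
    "input_sentence_4": "The incident caused him to turn a new leaf.",
    "sentence_quiz1": "He is happy now.",
    "sentence_quiz2": "He joined a gang.",
}
print(resolve_story(example))
```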
### Data Splits
| name |validation |test|
|-------|-----:|---:|
|lang|1871|1871|
|
chirunder/MSCS_40_page | ---
dataset_info:
features:
- name: html
dtype: string
splits:
- name: train
num_bytes: 6973933
num_examples: 40
download_size: 1637020
dataset_size: 6973933
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "MSCS_40_page"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chathuru/SplunkAttackRangeAlerts-v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 157136
num_examples: 536
- name: test
num_bytes: 37310
num_examples: 134
download_size: 43233
dataset_size: 194446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/asahi_takiguchi_mahoushoujosite | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Asahi Takiguchi
This is the dataset of Asahi Takiguchi, containing 21 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 21 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 46 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 21 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 21 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 21 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 21 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 21 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 46 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 46 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 46 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
JoinDatawithme/Humanface_of_various_age_groups | ---
license: apache-2.0
language:
- en
tags:
- face
- humanface
- ml
- face recognition
- Generated data
- dataset
size_categories:
- 100K<n<1M
---
📚 Dataset Introduction
- This dataset is constructed from generated data and contains a total of 1,000 human face images at a resolution of 1024x1024.
- Filenames serve as the labels for the images, formatted as "Number"-"Gender"-"Age".
- The dataset includes 500 images per gender, distributed across five age groups (1-10, 11-20, 21-30, 31-40, over 40) with 100 images in each group.
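For illustration only, a filename in the stated "Number"-"Gender"-"Age" format can be parsed back into its label fields like this (the concrete filename below is a hypothetical example, not taken from the dataset):

```python
def parse_face_label(filename: str) -> dict:
    # Filenames follow "Number"-"Gender"-"Age" per the dataset description.
    stem = filename.rsplit(".", 1)[0]          # drop the file extension
    number, gender, age = stem.split("-")      # split into the three fields
    return {"number": int(number), "gender": gender, "age": int(age)}

print(parse_face_label("0421-female-27.jpg"))
# → {'number': 421, 'gender': 'female', 'age': 27}
```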
This dataset is a sample of 1000 images from the full collection.
For more data inquiries, feel free to contact us 😄
🤖 About Us: We are developers from China, dedicated to accelerating AI development with high-quality data.
📮 Contact information: huawuque@join-intelligence.com |
CyberHarem/beatrice_rezero | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of beatrice (Re:Zero Kara Hajimeru Isekai Seikatsu)
This is the dataset of beatrice (Re:Zero Kara Hajimeru Isekai Seikatsu), containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
joey234/mmlu-anatomy-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 38830
num_examples: 135
download_size: 24408
dataset_size: 38830
---
# Dataset Card for "mmlu-anatomy-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NoahMartinezXiang/CREMA-D | ---
license: apache-2.0
---
|
HugAda/ada-sokol-style | ---
license: afl-3.0
---
|
contemmcm/amazon_reviews_2013 | ---
task_categories:
- text-classification
configs:
- config_name: all
data_files:
- split: train
path: "amazon_reviews_2013/all/train-*.parquet"
- split: test
path: "amazon_reviews_2013/all/test-*.parquet"
default: true
- config_name: amazon-instant-video
data_files:
- split: train
path: "amazon_reviews_2013/amazon-instant-video/train-*.parquet"
- split: test
path: "amazon_reviews_2013/amazon-instant-video/test-*.parquet"
- config_name: arts
data_files:
- split: train
path: "amazon_reviews_2013/arts/train-*.parquet"
- split: test
path: "amazon_reviews_2013/arts/test-*.parquet"
- config_name: automotive
data_files:
- split: train
path: "amazon_reviews_2013/automotive/train-*.parquet"
- split: test
path: "amazon_reviews_2013/automotive/test-*.parquet"
- config_name: baby
data_files:
- split: train
path: "amazon_reviews_2013/baby/train-*.parquet"
- split: test
path: "amazon_reviews_2013/baby/test-*.parquet"
- config_name: beauty
data_files:
- split: train
path: "amazon_reviews_2013/beauty/train-*.parquet"
- split: test
path: "amazon_reviews_2013/beauty/test-*.parquet"
- config_name: book
data_files:
- split: train
path: "amazon_reviews_2013/book/train-*.parquet"
- split: test
path: "amazon_reviews_2013/book/test-*.parquet"
- config_name: cell-phone
data_files:
- split: train
path: "amazon_reviews_2013/cell-phone/train-*.parquet"
- split: test
path: "amazon_reviews_2013/cell-phone/test-*.parquet"
- config_name: clothing
data_files:
- split: train
path: "amazon_reviews_2013/clothing/train-*.parquet"
- split: test
path: "amazon_reviews_2013/clothing/test-*.parquet"
- config_name: electronics
data_files:
- split: train
path: "amazon_reviews_2013/electronics/train-*.parquet"
- split: test
path: "amazon_reviews_2013/electronics/test-*.parquet"
- config_name: gourmet-food
data_files:
- split: train
path: "amazon_reviews_2013/gourmet-food/train-*.parquet"
- split: test
path: "amazon_reviews_2013/gourmet-food/test-*.parquet"
- config_name: health
data_files:
- split: train
path: "amazon_reviews_2013/health/train-*.parquet"
- split: test
path: "amazon_reviews_2013/health/test-*.parquet"
- config_name: home-kitchen
data_files:
- split: train
path: "amazon_reviews_2013/home-kitchen/train-*.parquet"
- split: test
path: "amazon_reviews_2013/home-kitchen/test-*.parquet"
- config_name: industrial-scientific
data_files:
- split: train
path: "amazon_reviews_2013/industrial-scientific/train-*.parquet"
- split: test
path: "amazon_reviews_2013/industrial-scientific/test-*.parquet"
- config_name: jewelry
data_files:
- split: train
path: "amazon_reviews_2013/jewelry/train-*.parquet"
- split: test
path: "amazon_reviews_2013/jewelry/test-*.parquet"
- config_name: kindle-store
data_files:
- split: train
path: "amazon_reviews_2013/kindle-store/train-*.parquet"
- split: test
path: "amazon_reviews_2013/kindle-store/test-*.parquet"
- config_name: movie-tv
data_files:
- split: train
path: "amazon_reviews_2013/movie-tv/train-*.parquet"
- split: test
path: "amazon_reviews_2013/movie-tv/test-*.parquet"
- config_name: music
data_files:
- split: train
path: "amazon_reviews_2013/music/train-*.parquet"
- split: test
path: "amazon_reviews_2013/music/test-*.parquet"
- config_name: musical-instrument
data_files:
- split: train
path: "amazon_reviews_2013/musical-instrument/train-*.parquet"
- split: test
path: "amazon_reviews_2013/musical-instrument/test-*.parquet"
- config_name: office
data_files:
- split: train
path: "amazon_reviews_2013/office/train-*.parquet"
- split: test
path: "amazon_reviews_2013/office/test-*.parquet"
- config_name: patio
data_files:
- split: train
path: "amazon_reviews_2013/patio/train-*.parquet"
- split: test
path: "amazon_reviews_2013/patio/test-*.parquet"
- config_name: pet-supply
data_files:
- split: train
path: "amazon_reviews_2013/pet-supply/train-*.parquet"
- split: test
path: "amazon_reviews_2013/pet-supply/test-*.parquet"
- config_name: shoe
data_files:
- split: train
path: "amazon_reviews_2013/shoe/train-*.parquet"
- split: test
path: "amazon_reviews_2013/shoe/test-*.parquet"
- config_name: software
data_files:
- split: train
path: "amazon_reviews_2013/software/train-*.parquet"
- split: test
path: "amazon_reviews_2013/software/test-*.parquet"
- config_name: sports-outdoor
data_files:
- split: train
path: "amazon_reviews_2013/sports-outdoor/train-*.parquet"
- split: test
path: "amazon_reviews_2013/sports-outdoor/test-*.parquet"
- config_name: tools-home-improvement
data_files:
- split: train
path: "amazon_reviews_2013/tools-home-improvement/train-*.parquet"
- split: test
path: "amazon_reviews_2013/tools-home-improvement/test-*.parquet"
- config_name: toy-game
data_files:
- split: train
path: "amazon_reviews_2013/toy-game/train-*.parquet"
- split: test
path: "amazon_reviews_2013/toy-game/test-*.parquet"
- config_name: video-game
data_files:
- split: train
path: "amazon_reviews_2013/video-game/train-*.parquet"
- split: test
path: "amazon_reviews_2013/video-game/test-*.parquet"
- config_name: watch
data_files:
- split: train
path: "amazon_reviews_2013/watch/train-*.parquet"
- split: test
path: "amazon_reviews_2013/watch/test-*.parquet"
dataset_info:
features:
- name: product/title
dtype: string
- name: product/price
dtype: string
- name: review/helpfulness
dtype: string
- name: review/score
dtype:
class_label:
names:
'0': 1 star
'1': 2 stars
'2': 3 stars
'3': 4 stars
'4': 5 stars
- name: review/time
dtype: int64
- name: review/summary
dtype: string
- name: review/text
dtype: string
- name: product/category
dtype:
class_label:
names:
'0': Amazon Instant Video
'1': Arts
'2': Automotive
'3': Baby
'4': Beauty
'5': Book
'6': Cell Phone
'7': Clothing
'8': Electronics
'9': Gourmet Food
'10': Health
'11': Home & Kitchen
'12': Industrial & Scientific
'13': Jewelry
'14': Kindle Store
'15': Movie & TV
'16': Musical Instrument
'17': Music
'18': Office
'19': Patio
'20': Pet Supply
'21': Shoe
'22': Software
'23': Sports & Outdoor
'24': Tools & Home Improvement
'25': Toy & Game
'26': Video Game
'27': Watch
- name: review/helpfulness_ratio
dtype: float64
- name: review/helpfulness_total_votes
dtype: int64
--- |
satyadewangan/processed_open_orca | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9287922
num_examples: 4585
download_size: 5190003
dataset_size: 9287922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kristinashemet/Instruction_Input_dataset_07_04 | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Input
dtype: string
splits:
- name: train
num_bytes: 297331
num_examples: 29
download_size: 141941
dataset_size: 297331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dongyoung4091/shp-generated_flan_t5_large_flan_t5_zeroshot | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: zeroshot_helpfulness
dtype: float64
- name: zeroshot_specificity
dtype: float64
- name: zeroshot_intent
dtype: float64
- name: zeroshot_factuality
dtype: float64
- name: zeroshot_easy-to-understand
dtype: int64
- name: zeroshot_relevance
dtype: int64
- name: zeroshot_readability
dtype: int64
- name: zeroshot_enough-detail
dtype: int64
- name: 'zeroshot_biased:'
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences
dtype: float64
- name: zeroshot_repetetive
dtype: float64
- name: zeroshot_fail-to-consider-context
dtype: float64
- name: zeroshot_too-long
dtype: int64
splits:
- name: train
num_bytes: 29519465
num_examples: 25600
download_size: 1900231
dataset_size: 29519465
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "shp-generated_flan_t5_large_flan_t5_zeroshot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argmaxinc/whisperkit-evals_01-30-24 |
---
pretty_name: "WhisperKit ASR Evaluation Results"
tags:
- whisper
- whisperkit
- coreml
- asr
- quantized
---
# WhisperKit Evaluation Results
## Dataset: `librispeech`
### WhisperKit + `openai_whisper-large-v3` (+optimized variants)
| | WER | QoI (%) | File Size (MB) |
|:----------------------------------------------------------------------------------------------------------------------------------------------|------:|----------:|-----------------:|
| [openai_whisper-large-v3](https://huggingface.co/argmaxinc/whisperkit-coreml-rc1/tree/main/openai_whisper-large-v3) | 2.44 | 100 | 3100 |
| [openai_whisper-large-v3_turbo](https://huggingface.co/argmaxinc/whisperkit-coreml-rc1/tree/main/openai_whisper-large-v3_turbo) | 2.41 | 99.8 | 3100 |
| [openai_whisper-large-v3_turbo_1307MB](https://huggingface.co/argmaxinc/whisperkit-coreml-rc1/tree/main/openai_whisper-large-v3_turbo_1307MB) | 2.6 | 97.7 | 1307 |
| [openai_whisper-large-v3_turbo_1049MB](https://huggingface.co/argmaxinc/whisperkit-coreml-rc1/tree/main/openai_whisper-large-v3_turbo_1049MB) | 4.81 | 91 | 1049 |
| [openai_whisper-large-v3_1053MB](https://huggingface.co/argmaxinc/whisperkit-coreml-rc1/tree/main/openai_whisper-large-v3_1053MB) | 4.65 | 90.8 | 1053 |
### Different Projects + `openai_whisper-large-v3`
| | WER | Commit Hash | Model Format |
|:-------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------|:--------------|:---------------|
| [WhisperKit](https://github.com/argmaxinc/whisperkit) | [2.44](https://hf.co/datasets/argmaxinc/whisperkit-evals-rc1/tree/main/WhisperKit/openai_whisper-large-v3/librispeech) | 0f8b4fe | Core ML |
| [WhisperCpp](https://github.com/ggerganov/whisper.cpp) | [2.36](https://hf.co/datasets/argmaxinc/whisperkit-evals-rc1/tree/main/whisper.cpp/openai_whisper-large-v3/librispeech) | e72e415 | Core ML + GGUF |
| [WhisperMLX](https://github.com/ml-explore/mlx-examples/blob/main/whisper/whisper/transcribe.py) | [2.69](https://hf.co/datasets/argmaxinc/whisperkit-evals-rc1/tree/main/WhisperMLX/openai_whisper-large-v3/librispeech) | 614de66 | MLX (Numpy) |
### Quality-of-Inference (QoI) Certification
We believe that rigorously measuring the quality of inference is necessary for developers and
enterprises to make informed decisions when opting to use optimized or compressed variants of
Whisper models in production. The current measurements are between reference and optimized
WhisperKit models. We are going to extend the scope of this measurement to other Whisper
implementations soon so developers can certify the behavior change (if any) caused by
alternating use of WhisperKit with (or migration from) these implementations.
In all measurements, we care primarily about per-example no-regressions (quantified as `qoi` below),
which is a stricter metric than the dataset-average WER. A 100% `qoi` preserves perfect
backwards-compatibility on the test distribution and avoids "perceived regressions", the phenomenon
where known per-example behavior changes after a code/model update and causes divergence in
downstream code or breaks the user experience itself (even if dataset averages might stay flat
across updates). Pseudocode for `qoi`:
```python
qoi = []
for example in dataset:
no_regression = wer(optimized_model(example)) <= wer(reference_model(example))
qoi.append(no_regression)
qoi = (sum(qoi) / len(qoi)) * 100.
```
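The pseudocode above leaves `wer` abstract. As a self-contained sketch (toy data and a simple word-level Levenshtein WER, not the normalization pipeline used in the actual evaluation), the same computation can be run as:

```python
def wer(reference: str, hypothesis: str) -> float:
    # Word-level Levenshtein distance divided by the reference length.
    ref, hyp = reference.split(), hypothesis.split()
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

def qoi(truths, reference_out, optimized_out) -> float:
    # Per-example no-regression rate: the optimized model must not be
    # worse than the reference model on any given example.
    flags = [
        wer(t, opt) <= wer(t, ref)
        for t, ref, opt in zip(truths, reference_out, optimized_out)
    ]
    return 100.0 * sum(flags) / len(flags)

truths        = ["the cat sat on the mat", "hello world"]
reference_out = ["the cat sat on the mat", "hello word"]
optimized_out = ["the cat sat on a mat", "hello world"]
print(qoi(truths, reference_out, optimized_out))  # → 50.0
```

Here the optimized model regresses on the first example (1 substitution vs. 0) but improves on the second, so exactly half the examples are no-regressions.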
We define the reference model as the default float16 precision Core ML model that is generated by
whisperkittools. This reference model matches the accuracy of the original PyTorch model
on the specified test sets. We use `librispeech/test.clean` (5 hours of short English audio clips)
as our testing set for Whisper. We are actively expanding our test set coverage to `earnings22`
(120 hours of long English audio clips with various accents). We anticipate developers that use Whisper in production to have
their own Quality Assurance test sets and whisperkittools offers the tooling necessary to run the
same measurements on such custom test sets, please see the [Model Evaluation on Custom Dataset](#evaluate-on-custom-dataset)
for details.
### Reproducing Results
Results on this page are generated by our cluster of Apple Silicon Macs, which we use as self-hosted runners on
GitHub Actions for our CI infrastructure. Due to [security concerns](https://docs.github.com/en/actions/security-guides/security-hardening-for-github-actions#hardening-for-self-hosted-runners),
we are unable to open up the cluster to the public. However, any Apple Silicon Mac (even with 8GB RAM) can be used to
run identical [evaluation jobs](#evaluation)
locally. For reference, our M2 Ultra devices complete a `librispeech` + `openai/whisper-large-v3`
evaluation in under 1 hour regardless of the Whisper implementation. Older Apple Silicon Macs should take less than
1 day to complete the same evaluation.
Glossary:
- `_turbo`: Indicates the presence of additional optimizations (not compression) to unlock streaming transcription
as described in our [Blog Post](https://www.takeargmax.com/blog/whisperkit).
- `_*MB`: Indicates the presence of mixed-bit quantization. Instead of cluttering the filename with details like
`_AudioEncoder-5.8bits_TextDecoder-6.1bits`, we choose to summarize the compression spec as the resulting total file size since this is what matters to developers in production.
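If the parameter count of a model is known, the `_*MB` file-size suffix can be mapped back to an approximate average bit-width per weight. A rough sketch, where the file size and the ~1.55B parameter count for whisper-large-v3 are illustrative numbers rather than measured values:

```python
def approx_bits_per_parameter(file_size_mb: float, num_params: float) -> float:
    """Rough average bit-width implied by a compressed model's total file size."""
    # file size in bits divided by the number of weights
    return file_size_mb * 1e6 * 8 / num_params

# Illustrative: a hypothetical 1100 MB artifact for a ~1.55B-parameter model.
print(round(approx_bits_per_parameter(1100, 1.55e9), 1))  # roughly 5.7 bits/weight
```

This is only an approximation: the file also contains metadata and unquantized layers, so the true per-layer bit-widths differ from this average.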
|
allennghayoui/code_assistant_dataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_sarahlintang__mistral-indo-7b | ---
pretty_name: Evaluation run of sarahlintang/mistral-indo-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sarahlintang/mistral-indo-7b](https://huggingface.co/sarahlintang/mistral-indo-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sarahlintang__mistral-indo-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T21:47:22.078031](https://huggingface.co/datasets/open-llm-leaderboard/details_sarahlintang__mistral-indo-7b/blob/main/results_2024-02-03T21-47-22.078031.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6261916134620902,\n\
\ \"acc_stderr\": 0.032466151879964746,\n \"acc_norm\": 0.6326766330349893,\n\
\ \"acc_norm_stderr\": 0.033132950603814465,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.423359772241798,\n\
\ \"mc2_stderr\": 0.014387833736527744\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650647,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6156144194383589,\n\
\ \"acc_stderr\": 0.0048545552940175585,\n \"acc_norm\": 0.8118900617406891,\n\
\ \"acc_norm_stderr\": 0.0039000125049579717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n\
\ \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922524,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922524\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.01581390128391305,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.01581390128391305\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983964,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983964\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578656,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578656\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.423359772241798,\n\
\ \"mc2_stderr\": 0.014387833736527744\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409347\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3206974981046247,\n \
\ \"acc_stderr\": 0.012856468433722302\n }\n}\n```"
repo_url: https://huggingface.co/sarahlintang/mistral-indo-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|arc:challenge|25_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|gsm8k|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hellaswag|10_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T21-47-22.078031.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- '**/details_harness|winogrande|5_2024-02-03T21-47-22.078031.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T21-47-22.078031.parquet'
- config_name: results
data_files:
- split: 2024_02_03T21_47_22.078031
path:
- results_2024-02-03T21-47-22.078031.parquet
- split: latest
path:
- results_2024-02-03T21-47-22.078031.parquet
---
# Dataset Card for Evaluation run of sarahlintang/mistral-indo-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sarahlintang/mistral-indo-7b](https://huggingface.co/sarahlintang/mistral-indo-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sarahlintang__mistral-indo-7b",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-03T21:47:22.078031](https://huggingface.co/datasets/open-llm-leaderboard/details_sarahlintang__mistral-indo-7b/blob/main/results_2024-02-03T21-47-22.078031.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6261916134620902,
"acc_stderr": 0.032466151879964746,
"acc_norm": 0.6326766330349893,
"acc_norm_stderr": 0.033132950603814465,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.423359772241798,
"mc2_stderr": 0.014387833736527744
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650647,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6156144194383589,
"acc_stderr": 0.0048545552940175585,
"acc_norm": 0.8118900617406891,
"acc_norm_stderr": 0.0039000125049579717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462836,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462836
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922524,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922524
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.01581390128391305,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.01581390128391305
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983964,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983964
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578656,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578656
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.423359772241798,
"mc2_stderr": 0.014387833736527744
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409347
},
"harness|gsm8k|5": {
"acc": 0.3206974981046247,
"acc_stderr": 0.012856468433722302
}
}
```
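Once loaded, the results dictionary can be aggregated however you need. As a minimal sketch (plain Python, no extra dependencies; the sample values below are copied from the JSON above rather than fetched from the Hub), here is how you might compute a macro-average accuracy over the MMLU (`hendrycksTest`) sub-tasks:

```python
# Sample of the results dict shown above, keyed by "harness|<task>|<n_shots>".
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|winogrande|5": {"acc": 0.7837411207576953},
}

# Keep only the MMLU sub-tasks and average their accuracies.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))  # → 0.4624
```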
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
language:
- en
license: apache-2.0
---
# Dataset Card
<!-- Provide a quick summary of the dataset. -->
HR-MultiWOZ is a fully-labeled dataset of 5,980 extractive QA examples spanning 10 HR domains for evaluating LLM agents. It is the first labeled, open-sourced conversation dataset in the HR domain for NLP research.
Please refer to [HR-MultiWOZ: A Task Oriented Dialogue (TOD) Dataset for HR LLM Agent](https://arxiv.org/pdf/2402.01018.pdf) for details about the dataset construction.
### Dataset Description
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [xwjzds/extractive_qa_question_answering_hr](https://huggingface.co/datasets/xwjzds/extractive_qa_question_answering_hr)
- **Paper:** [HR-MultiWOZ: A Task Oriented Dialogue (TOD) Dataset for HR LLM Agent](https://arxiv.org/pdf/2402.01018.pdf)
- **Leaderboard:** [github repo](https://github.com/amazon-science/hr-multiwoz-tod-llm-agent)
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset has been designed to evaluate transfer learning ability for extractive QA algorithms.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset is not intended for use in training.
## Dataset Structure
### Data Instances
A typical data entry consists of an `answer_context`, a `question`, and an `answer`. Below is an example from the dataset:
```python
question = "What is the main topic or subject of the training you are requesting?"
answer = "machine learning"
answer_context = "Employee: We're hoping to improve our machine learning research skills. But no special accommodations are needed."
```
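Since the QA is extractive, each `answer` occurs verbatim in its `answer_context`. A minimal sketch of recovering the SQuAD-style character span (not part of the dataset tooling; the case-insensitive fallback is an assumption about possible casing differences):

```python
def find_answer_span(answer_context: str, answer: str):
    """Return (start, end) character offsets of `answer` in `answer_context`,
    or None if the answer does not occur in the context."""
    start = answer_context.find(answer)
    if start == -1:
        # Assumption: some answers may differ from the context only in casing.
        start = answer_context.lower().find(answer.lower())
    if start == -1:
        return None
    return start, start + len(answer)

context = ("Employee: We're hoping to improve our machine learning research "
           "skills. But no special accommodations are needed.")
span = find_answer_span(context, "machine learning")  # (38, 54)
```

Spans recovered this way can serve as start/end supervision or evaluation targets for extractive QA models.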
### Data Fields
The dataset comprises the following fields:
- `question`: a string representing the question
- `answer`: a string representing the answer
- `answer_context`: a string in which the answer appears
## Dataset Creation
Please refer to [HR-MultiWOZ: A Task Oriented Dialogue (TOD) Dataset for HR LLM Agent](https://arxiv.org/pdf/2402.01018.pdf) for details about the dataset construction.
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
Not Amazon
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
None
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
This dataset is in English and contains synthetic problems.
## Citation
If you find this work useful, you can cite the paper as follows:
```
@inproceedings{xu-etal-2024-hr,
title = "{HR}-{M}ulti{WOZ}: A Task Oriented Dialogue ({TOD}) Dataset for {HR} {LLM} Agent",
author = "Xu, Weijie and
Huang, Zicheng and
Hu, Wenxiang and
Fang, Xi and
Cherukuri, Rajesh and
Nayyar, Naumaan and
Malandri, Lorenzo and
Sengamedu, Srinivasan",
editor = "Hruschka, Estevam and
Lake, Thom and
Otani, Naoki and
Mitchell, Tom",
booktitle = "Proceedings of the First Workshop on Natural Language Processing for Human Resources (NLP4HR 2024)",
month = mar,
year = "2024",
address = "St. Julian{'}s, Malta",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2024.nlp4hr-1.5",
pages = "59--72",
abstract = "Recent advancements in Large Language Models (LLMs) have been reshaping Natural Language Processing (NLP) task in several domains. Their use in the field of Human Resources (HR) has still room for expansions and could be beneficial for several time consuming tasks. Examples such as time-off submissions, medical claims filing, and access requests are noteworthy, but they are by no means the sole instances. However the aforementioned developments must grapple with the pivotal challenge of constructing a high-quality training dataset. On one hand, most conversation datasets are solving problems for customers not employees. On the other hand, gathering conversations with HR could raise privacy concerns. To solve it, we introduce HR-Multiwoz, a fully-labeled dataset of 550 conversations spanning 10 HR domains. Our work has the following contributions:(1) It is the first labeled open-sourced conversation dataset in the HR domain for NLP research. (2) It provides a detailed recipe for the data generation procedure along with data analysis and human evaluations. The data generation pipeline is transferrable and can be easily adapted for labeled conversation data generation in other domains. (3) The proposed data-collection pipeline is mostly based on LLMs with minimal human involvement for annotation, which is time and cost-efficient.",
}
```
|
jmamou/augmented-glue-sst2 | ---
annotations_creators:
- machine-generated
extended:
- original
language_creators:
- machine-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card for Augmented-GLUE-SST2
Automatically augmented data from the train split of the SST-2 dataset, generated with a conditional text generation approach.
The code used to generate this dataset will soon be available at https://github.com/IntelLabs/nlp-architect.
|
Jiqing/ProtST-BetaLactamase | ---
license: unknown
---
|
efederici/wn_xpt_new | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4959142030.197477
num_examples: 1520382
download_size: 5157431627
dataset_size: 4959142030.197477
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wn-xpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/reborntomasterthebladefromherokingtoextraordinarysquire | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Reborn To Master The Blade From Hero-king To Extraordinary Squire
This is the image base of the bangumi Reborn to Master the Blade: From Hero-King to Extraordinary Squire. We detected 38 characters and 1790 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 12 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 14 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 20 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 7 | [Download](3/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 4 | 6 | [Download](4/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 5 | 24 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 9 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 64 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 25 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 49 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 28 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 9 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 56 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 207 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 39 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 40 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 53 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 53 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 45 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 210 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 22 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 26 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 14 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 34 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 7 | [Download](24/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 25 | 28 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 63 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 148 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 6 | [Download](28/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 29 | 67 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 19 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 18 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 16 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 8 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 88 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 33 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 8 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 215 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
AdapterOcean/med_alpaca_standardized_cluster_31_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 20273739
num_examples: 32064
download_size: 10128239
dataset_size: 20273739
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_31_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unlabeledstudiosllc/client-messages | ---
license: unknown
---
|
citclass/citclass_91 | ---
license: agpl-3.0
---
|
open-llm-leaderboard/details_abacusai__bigstral-12b-32k | ---
pretty_name: Evaluation run of abacusai/bigstral-12b-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/bigstral-12b-32k](https://huggingface.co/abacusai/bigstral-12b-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__bigstral-12b-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T07:04:14.941026](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__bigstral-12b-32k/blob/main/results_2024-04-09T07-04-14.941026.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5886067247133707,\n\
\ \"acc_stderr\": 0.033453973801537036,\n \"acc_norm\": 0.5950897976915057,\n\
\ \"acc_norm_stderr\": 0.03415396911104142,\n \"mc1\": 0.5165238678090576,\n\
\ \"mc1_stderr\": 0.01749394019005772,\n \"mc2\": 0.6816806117025854,\n\
\ \"mc2_stderr\": 0.015370597773942923\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268448\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6587333200557658,\n\
\ \"acc_stderr\": 0.004731657228906986,\n \"acc_norm\": 0.8410675164309899,\n\
\ \"acc_norm_stderr\": 0.0036486590414936413\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518027,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518027\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646796,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646796\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137292,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137292\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.031631458075523776,\n\
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.031631458075523776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035307,\n \"\
acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.01567100600933957,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.01567100600933957\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688228,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688228\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4033519553072626,\n\
\ \"acc_stderr\": 0.016407123032195253,\n \"acc_norm\": 0.4033519553072626,\n\
\ \"acc_norm_stderr\": 0.016407123032195253\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799802,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799802\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03004261583271487,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03004261583271487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.019627444748412232,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.019627444748412232\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982055,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982055\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5165238678090576,\n\
\ \"mc1_stderr\": 0.01749394019005772,\n \"mc2\": 0.6816806117025854,\n\
\ \"mc2_stderr\": 0.015370597773942923\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26459438968915844,\n \
\ \"acc_stderr\": 0.012150554001563238\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/bigstral-12b-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-28-15.531868.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-04-14.941026.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-04-14.941026.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- '**/details_harness|winogrande|5_2024-03-09T19-28-15.531868.parquet'
- split: 2024_04_09T07_04_14.941026
path:
- '**/details_harness|winogrande|5_2024-04-09T07-04-14.941026.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T07-04-14.941026.parquet'
- config_name: results
data_files:
- split: 2024_03_09T19_28_15.531868
path:
- results_2024-03-09T19-28-15.531868.parquet
- split: 2024_04_09T07_04_14.941026
path:
- results_2024-04-09T07-04-14.941026.parquet
- split: latest
path:
- results_2024-04-09T07-04-14.941026.parquet
---
# Dataset Card for Evaluation run of abacusai/bigstral-12b-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/bigstral-12b-32k](https://huggingface.co/abacusai/bigstral-12b-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__bigstral-12b-32k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-09T07:04:14.941026](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__bigstral-12b-32k/blob/main/results_2024-04-09T07-04-14.941026.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5886067247133707,
"acc_stderr": 0.033453973801537036,
"acc_norm": 0.5950897976915057,
"acc_norm_stderr": 0.03415396911104142,
"mc1": 0.5165238678090576,
"mc1_stderr": 0.01749394019005772,
"mc2": 0.6816806117025854,
"mc2_stderr": 0.015370597773942923
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268448
},
"harness|hellaswag|10": {
"acc": 0.6587333200557658,
"acc_stderr": 0.004731657228906986,
"acc_norm": 0.8410675164309899,
"acc_norm_stderr": 0.0036486590414936413
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518027,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518027
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646796,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646796
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137292,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137292
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.031631458075523776,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.031631458075523776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.01567100600933957,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.01567100600933957
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688228,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688228
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4033519553072626,
"acc_stderr": 0.016407123032195253,
"acc_norm": 0.4033519553072626,
"acc_norm_stderr": 0.016407123032195253
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799802,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799802
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03004261583271487,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03004261583271487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.019627444748412232,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.019627444748412232
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5165238678090576,
"mc1_stderr": 0.01749394019005772,
"mc2": 0.6816806117025854,
"mc2_stderr": 0.015370597773942923
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
},
"harness|gsm8k|5": {
"acc": 0.26459438968915844,
"acc_stderr": 0.012150554001563238
}
}
```
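The top-level `"all"` block is a macro-average over the per-task entries. As a rough sketch of how such an aggregate is computed (using only a small subset of the values shown above, so the result will not match the full `"all"` figure, which averages every MMLU subtask plus the other benchmarks):

```python
# Sketch: recompute a macro-average accuracy from a few of the
# per-task entries in the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.625},
}

# Collect each task's accuracy and average them with equal weight.
accs = [task["acc"] for task in results.values()]
macro_avg = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {macro_avg:.4f}")
```

This equal-weight averaging is why small tasks (with correspondingly large `acc_stderr` values) can move the aggregate as much as large ones.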
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joey234/mmlu-high_school_physics-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 8534
num_examples: 5
- name: test
num_bytes: 1413219
num_examples: 151
download_size: 206264
dataset_size: 1421753
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_physics-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jsfs11__WestOrcaNeuralMarco-DPO-v2-DARETIES-7B | ---
pretty_name: Evaluation run of jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B](https://huggingface.co/jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__WestOrcaNeuralMarco-DPO-v2-DARETIES-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T07:53:57.376778](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__WestOrcaNeuralMarco-DPO-v2-DARETIES-7B/blob/main/results_2024-01-23T07-53-57.376778.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655303418618015,\n\
\ \"acc_stderr\": 0.03206026718388969,\n \"acc_norm\": 0.6549524807930509,\n\
\ \"acc_norm_stderr\": 0.03272590499883366,\n \"mc1\": 0.5165238678090576,\n\
\ \"mc1_stderr\": 0.01749394019005772,\n \"mc2\": 0.6596210038626965,\n\
\ \"mc2_stderr\": 0.015322599620782891\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002508,\n\
\ \"acc_norm\": 0.7192832764505119,\n \"acc_norm_stderr\": 0.01313123812697557\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7074287990440151,\n\
\ \"acc_stderr\": 0.0045401340050603214,\n \"acc_norm\": 0.880601473809998,\n\
\ \"acc_norm_stderr\": 0.003235941810943157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n\
\ \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n\
\ \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n\
\ \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n\
\ \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"\
acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n\
\ \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451166,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451166\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827044,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5165238678090576,\n\
\ \"mc1_stderr\": 0.01749394019005772,\n \"mc2\": 0.6596210038626965,\n\
\ \"mc2_stderr\": 0.015322599620782891\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247022\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \
\ \"acc_stderr\": 0.012607137125693635\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|arc:challenge|25_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|gsm8k|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hellaswag|10_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T07-53-57.376778.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T07-53-57.376778.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- '**/details_harness|winogrande|5_2024-01-23T07-53-57.376778.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T07-53-57.376778.parquet'
- config_name: results
data_files:
- split: 2024_01_23T07_53_57.376778
path:
- results_2024-01-23T07-53-57.376778.parquet
- split: latest
path:
- results_2024-01-23T07-53-57.376778.parquet
---
# Dataset Card for Evaluation run of jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B](https://huggingface.co/jsfs11/WestOrcaNeuralMarco-DPO-v2-DARETIES-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__WestOrcaNeuralMarco-DPO-v2-DARETIES-7B",
	"harness_winogrande_5",
	split="latest")
```
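The per-run split names are derived mechanically from the run timestamp: `-` and `:` are replaced with `_` (compare the timestamp `2024-01-23T07:53:57.376778` with the split `2024_01_23T07_53_57.376778` in the configs above). A minimal sketch of that convention; the helper name is illustrative, not part of any API:

```python
def split_name_from_timestamp(timestamp: str) -> str:
    """Derive the split name used in this repo from a run timestamp."""
    # "2024-01-23T07:53:57.376778" -> "2024_01_23T07_53_57.376778"
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-01-23T07:53:57.376778"))
```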
## Latest results
These are the [latest results from run 2024-01-23T07:53:57.376778](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__WestOrcaNeuralMarco-DPO-v2-DARETIES-7B/blob/main/results_2024-01-23T07-53-57.376778.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.655303418618015,
"acc_stderr": 0.03206026718388969,
"acc_norm": 0.6549524807930509,
"acc_norm_stderr": 0.03272590499883366,
"mc1": 0.5165238678090576,
"mc1_stderr": 0.01749394019005772,
"mc2": 0.6596210038626965,
"mc2_stderr": 0.015322599620782891
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002508,
"acc_norm": 0.7192832764505119,
"acc_norm_stderr": 0.01313123812697557
},
"harness|hellaswag|10": {
"acc": 0.7074287990440151,
"acc_stderr": 0.0045401340050603214,
"acc_norm": 0.880601473809998,
"acc_norm_stderr": 0.003235941810943157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451166,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451166
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827044,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5165238678090576,
"mc1_stderr": 0.01749394019005772,
"mc2": 0.6596210038626965,
"mc2_stderr": 0.015322599620782891
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247022
},
"harness|gsm8k|5": {
"acc": 0.7012888551933283,
"acc_stderr": 0.012607137125693635
}
}
```
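Each task entry in the results dict above has the same shape, so per-category summaries are easy to compute offline once the JSON is loaded. A small sketch over an abridged copy of a few entries above (the subset is shortened here purely for illustration):

```python
# Abridged copy of a few entries from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.6945392491467577, "acc_norm": 0.7192832764505119},
    "harness|hellaswag|10": {"acc": 0.7074287990440151, "acc_norm": 0.880601473809998},
    "harness|hendrycksTest-virology|5": {"acc": 0.5602409638554217, "acc_norm": 0.5602409638554217},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8304093567251462, "acc_norm": 0.8304093567251462},
    "harness|winogrande|5": {"acc": 0.8279400157853196},
}

# Collect the MMLU (hendrycksTest) subtasks and average their accuracy.
mmlu = {name: scores["acc"] for name, scores in results.items() if "hendrycksTest" in name}
mmlu_avg = sum(mmlu.values()) / len(mmlu)
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {mmlu_avg:.4f}")
```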
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NLPC-UOM/English-Tamil-Parallel-Corpus | ---
annotations_creators: []
languages:
- en
- ta
licenses:
- mit
---
# English-Tamil Parallel Corpus

An English-Tamil parallel corpus prepared by the National Languages Processing Center, University of Moratuwa. The data has been cleaned and then aligned.

- En-Ta glossary line count: 22477
- En-Ta corpus line count: 8950

**Source:** Data extracted from publicly available government resources such as annual reports, procurement reports, circulars and websites.

**Processing:** Each Word/PDF file was converted to a text file, and Unicode errors were fixed using a custom tool. The Tamil and English files were then manually sentence-aligned, and all spelling and grammatical errors were manually fixed.

If you use this dataset, kindly cite the following publication:

Fernando, A., Ranathunga, S., & Dias, G. (2020). Data Augmentation and Terminology Integration for Domain-Specific Sinhala-English-Tamil Statistical Machine Translation. arXiv preprint arXiv:2011.02821.
|
tnash6/myFirstDataset | ---
license: cc-by-3.0
---
|
stjarvie/question_to_sql_with_ddl_test_2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: sql
dtype: string
- name: schema
dtype: string
splits:
- name: test
num_bytes: 2005
num_examples: 10
download_size: 3421
dataset_size: 2005
---
# Dataset Card for "question_to_sql_with_ddl_test_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bjornbundgaard/newdatacolab | ---
license: unknown
---
|
open-llm-leaderboard/details_manishiitg__open-aditi-hi-v1 | ---
pretty_name: Evaluation run of manishiitg/open-aditi-hi-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [manishiitg/open-aditi-hi-v1](https://huggingface.co/manishiitg/open-aditi-hi-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_manishiitg__open-aditi-hi-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:54:17.291176](https://huggingface.co/datasets/open-llm-leaderboard/details_manishiitg__open-aditi-hi-v1/blob/main/results_2024-01-05T00-54-17.291176.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.583972416790564,\n\
\ \"acc_stderr\": 0.033496039335568516,\n \"acc_norm\": 0.5889834623305892,\n\
\ \"acc_norm_stderr\": 0.03419061870668813,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.01572313952460875,\n \"mc2\": 0.42336343098693335,\n\
\ \"mc2_stderr\": 0.014576483672407868\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.01451842182567045,\n\
\ \"acc_norm\": 0.5878839590443686,\n \"acc_norm_stderr\": 0.014383915302225402\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6185022903804023,\n\
\ \"acc_stderr\": 0.004847615216473459,\n \"acc_norm\": 0.8137821151165107,\n\
\ \"acc_norm_stderr\": 0.003884868131822894\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.03784271932887468,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.03784271932887468\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7225806451612903,\n \"acc_stderr\": 0.025470196835900055,\n \"\
acc_norm\": 0.7225806451612903,\n \"acc_norm_stderr\": 0.025470196835900055\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177495,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7357798165137615,\n \"acc_stderr\": 0.018904164171510175,\n \"\
acc_norm\": 0.7357798165137615,\n \"acc_norm_stderr\": 0.018904164171510175\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956039,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956039\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.01541130876968693,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.01541130876968693\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3195530726256983,\n\
\ \"acc_stderr\": 0.01559552029414741,\n \"acc_norm\": 0.3195530726256983,\n\
\ \"acc_norm_stderr\": 0.01559552029414741\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455326,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455326\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596455,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596455\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5915032679738562,\n \"acc_stderr\": 0.01988622103750187,\n \
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.01988622103750187\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683903,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683903\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.01572313952460875,\n \"mc2\": 0.42336343098693335,\n\
\ \"mc2_stderr\": 0.014576483672407868\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650877\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33434420015163,\n \
\ \"acc_stderr\": 0.012994634003332764\n }\n}\n```"
repo_url: https://huggingface.co/manishiitg/open-aditi-hi-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-54-17.291176.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-54-17.291176.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- '**/details_harness|winogrande|5_2024-01-05T00-54-17.291176.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-54-17.291176.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_54_17.291176
path:
- results_2024-01-05T00-54-17.291176.parquet
- split: latest
path:
- results_2024-01-05T00-54-17.291176.parquet
---
# Dataset Card for Evaluation run of manishiitg/open-aditi-hi-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [manishiitg/open-aditi-hi-v1](https://huggingface.co/manishiitg/open-aditi-hi-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_manishiitg__open-aditi-hi-v1",
"harness_winogrande_5",
split="train")
```
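As the paragraph above notes, each run is stored as a split named after the run's timestamp. The naming convention is simply the timestamp with `-` and `:` replaced by `_`; a minimal sketch of that mapping (the helper name is ours, not part of any library):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results filenames) to the
    corresponding split name used by this repository's configurations."""
    # Split names replace '-' and ':' with '_' but keep the fractional seconds.
    return timestamp.replace("-", "_").replace(":", "_")

# "2024-01-05T00:54:17.291176" -> "2024_01_05T00_54_17.291176"
split_name = run_timestamp_to_split("2024-01-05T00:54:17.291176")
print(split_name)
```

The resulting name can then be passed as `split=` to `load_dataset` in place of `"train"` when you want a specific run rather than the latest one.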
## Latest results
These are the [latest results from run 2024-01-05T00:54:17.291176](https://huggingface.co/datasets/open-llm-leaderboard/details_manishiitg__open-aditi-hi-v1/blob/main/results_2024-01-05T00-54-17.291176.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.583972416790564,
"acc_stderr": 0.033496039335568516,
"acc_norm": 0.5889834623305892,
"acc_norm_stderr": 0.03419061870668813,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.01572313952460875,
"mc2": 0.42336343098693335,
"mc2_stderr": 0.014576483672407868
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.01451842182567045,
"acc_norm": 0.5878839590443686,
"acc_norm_stderr": 0.014383915302225402
},
"harness|hellaswag|10": {
"acc": 0.6185022903804023,
"acc_stderr": 0.004847615216473459,
"acc_norm": 0.8137821151165107,
"acc_norm_stderr": 0.003884868131822894
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887468,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177495,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7357798165137615,
"acc_stderr": 0.018904164171510175,
"acc_norm": 0.7357798165137615,
"acc_norm_stderr": 0.018904164171510175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956039,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956039
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.01541130876968693,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.01541130876968693
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3195530726256983,
"acc_stderr": 0.01559552029414741,
"acc_norm": 0.3195530726256983,
"acc_norm_stderr": 0.01559552029414741
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291484,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291484
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455326,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455326
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596455,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596455
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.01988622103750187,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.01988622103750187
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683903,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683903
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.01572313952460875,
"mc2": 0.42336343098693335,
"mc2_stderr": 0.014576483672407868
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650877
},
"harness|gsm8k|5": {
"acc": 0.33434420015163,
"acc_stderr": 0.012994634003332764
}
}
```
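For quick inspection, per-task accuracies can be pulled out of a results dict shaped like the JSON above. This sketch works on a truncated inline copy rather than re-downloading the file:

```python
# Truncated inline copy of the results JSON above, for illustration only.
results = {
    "all": {"acc": 0.583972416790564},
    "harness|arc:challenge|25": {"acc": 0.5563139931740614},
    "harness|hellaswag|10": {"acc": 0.6185022903804023},
    "harness|hendrycksTest-virology|5": {"acc": 0.463855421686747},
}

# Collect per-task accuracies, skipping the aggregated "all" entry and any
# task (e.g. truthfulqa) that reports metrics other than "acc".
per_task = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
best_task = max(per_task, key=per_task.get)
print(best_task, per_task[best_task])
```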
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
btemirov/distill-whisper-fin-jargon | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 6999933852.0
num_examples: 43
- name: test
num_bytes: 6863369648.0
num_examples: 53
download_size: 12039318337
dataset_size: 13863303500.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AY027/sample_leg | ---
dataset_info:
features:
- name: file_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 148.4
num_examples: 7
- name: test
num_bytes: 63.6
num_examples: 3
download_size: 2524
dataset_size: 212.0
---
# Dataset Card for "sample_leg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AmelieSchreiber/family_split_protein_binding_sites | ---
license: mit
task_categories:
- token-classification
language:
- en
tags:
- protein
pretty_name: UniProt Segmented Binding/Active Sites
size_categories:
- 100K<n<1M
---
# UniProt Segmented Binding/Active Sites
This is a train/test split of 209,571 protein sequences from UniProt with active-site and binding-site labels.
All protein sequences (and their corresponding binding-site labels) are segmented into chunks of 1,000 residues or fewer. Segmented sequences are indicated
by a `_partN` suffix in the Entry column. The split is approximately 85/15 before segmentation. The proteins are sorted by family in
decreasing order of family size, so families with more protein sequences appear earlier. Moreover, the split is such that the families in the
train and test sets are non-overlapping, and the `15%` test dataset is composed of the largest families.
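If a downstream task needs whole proteins rather than 1,000-residue chunks, the `_partN` suffix described above can be stripped to regroup segments. A minimal sketch (the function names and example entries are hypothetical, and it assumes segments arrive in part order):

```python
import re
from collections import defaultdict

def base_entry(entry: str) -> str:
    """Strip a trailing `_partN` segment suffix, if present, from an Entry id."""
    return re.sub(r"_part\d+$", "", entry)

def regroup(rows):
    """Group segmented (entry, sequence) chunks back under their original entry.

    Assumes segments appear in part order; sort by part number first if not.
    """
    grouped = defaultdict(list)
    for entry, seq in rows:
        grouped[base_entry(entry)].append(seq)
    return {entry: "".join(parts) for entry, parts in grouped.items()}

# Hypothetical entries illustrating the naming scheme:
chunks = [("P12345_part1", "MKT"), ("P12345_part2", "GLV"), ("Q99999", "AAA")]
print(regroup(chunks))
```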
Ara88/MRI_images_brain_tumors | ---
license: cc-by-4.0
---
|
Firminoleo/alinevoz | ---
license: openrail
---
|
CyberHarem/i_8_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of i_8 (Kantai Collection)
This is the dataset of i_8 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blonde_hair, glasses, twintails, red-framed_eyewear, breasts, hat, low_twintails, long_hair, large_breasts, semi-rimless_eyewear, blue_eyes, under-rim_eyewear, peaked_cap, ahoge, sailor_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 436.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_8_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 291.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_8_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1164 | 625.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_8_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 403.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_8_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1164 | 810.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_8_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/i_8_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, paizuri, penis, one-piece_swimsuit, school_swimsuit, cum, censored, green_eyes, nipples, open_mouth, smile |
| 1 | 32 |  |  |  |  |  | 1girl, one-piece_swimsuit, school_swimsuit, solo, looking_at_viewer, smile, white_thighhighs, name_tag, book, green_eyes, white_background, torpedo |
| 2 | 7 |  |  |  |  |  | 1girl, name_tag, one-piece_swimsuit, school_swimsuit, solo, upper_body, looking_at_viewer, smile, simple_background, twitter_username, white_background |
| 3 | 6 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, navel, solo, blush, collarbone, cowboy_shot, side-tie_bikini_bottom, simple_background, white_background, white_bikini, hair_between_eyes, twitter_username, cropped_legs, groin, one-hour_drawing_challenge |
| 4 | 7 |  |  |  |  |  | blush, lemon_slice, smile, white_dress, white_gloves, 1girl, drinking_glass, fur_trim, looking_at_viewer, solo, capelet, holding_cup, hair_ribbon, open_mouth, simple_background, upper_body |
| 5 | 14 |  |  |  |  |  | 1boy, hetero, solo_focus, 1girl, blush, nipples, open_mouth, school_swimsuit, thighhighs, sex, vaginal, penis, swimsuit_aside, spread_legs, green_eyes, mosaic_censoring, on_back, sweat, cum_in_pussy, missionary, one-piece_swimsuit_pull, saliva |
| 6 | 5 |  |  |  |  |  | alternate_costume, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, strapless_leotard, wrist_cuffs, 1girl, blush, looking_at_viewer, rabbit_tail, solo, book, open_mouth, blue_leotard, bowtie, cleavage, covered_navel, fake_tail, fishnet_pantyhose, hair_between_eyes, high_heels, medium_breasts, pink_background, smile, twitter_username, white_leotard, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | solo_focus | blush | paizuri | penis | one-piece_swimsuit | school_swimsuit | cum | censored | green_eyes | nipples | open_mouth | smile | solo | looking_at_viewer | white_thighhighs | name_tag | book | white_background | torpedo | upper_body | simple_background | twitter_username | cleavage | navel | collarbone | cowboy_shot | side-tie_bikini_bottom | white_bikini | hair_between_eyes | cropped_legs | groin | one-hour_drawing_challenge | lemon_slice | white_dress | white_gloves | drinking_glass | fur_trim | capelet | holding_cup | hair_ribbon | thighhighs | sex | vaginal | swimsuit_aside | spread_legs | mosaic_censoring | on_back | sweat | cum_in_pussy | missionary | one-piece_swimsuit_pull | saliva | alternate_costume | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | rabbit_tail | blue_leotard | bowtie | covered_navel | fake_tail | fishnet_pantyhose | high_heels | medium_breasts | pink_background | white_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:-------------|:--------|:----------|:--------|:---------------------|:------------------|:------|:-----------|:-------------|:----------|:-------------|:--------|:-------|:--------------------|:-------------------|:-----------|:-------|:-------------------|:----------|:-------------|:--------------------|:-------------------|:-----------|:--------|:-------------|:--------------|:-------------------------|:---------------|:--------------------|:---------------|:--------|:-----------------------------|:--------------|:--------------|:---------------|:-----------------|:-----------|:----------|:--------------|:--------------|:-------------|:------|:----------|:-----------------|:--------------|:-------------------|:----------|:--------|:---------------|:-------------|:--------------------------|:---------|:--------------------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:--------------|:---------------|:---------|:----------------|:------------|:--------------------|:-------------|:-----------------|:------------------|:----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 32 |  |  |  |  |  | | X | | | | | | X | X | | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | | X | | | | | | X | X | | | | | | X | X | X | | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | | X | | | X | | | | | | | | | | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | | X | | | X | | | | | | | | | X | X | X | X | | | | | | X | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 14 |  |  |  |  |  | X | X | X | X | X | | X | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | | X | | | X | | | | | | | | | X | X | X | X | X | | X | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TheAIchemist13/gramvaani_preprocessed_hi_train | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 47173100841.0
num_examples: 37080
download_size: 15250516029
dataset_size: 47173100841.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gramvaani_preprocessed_hi_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Harshvardhan27/Wikicorpus_Fine_Tuned_Mistral_FinalCheckpoint | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: input_prompt
dtype: string
- name: output_text
dtype: string
- name: output_length
dtype: int64
- name: output_cleaned
dtype: string
splits:
- name: train
num_bytes: 4287294
num_examples: 1000
- name: test
num_bytes: 849016
num_examples: 200
download_size: 3056330
dataset_size: 5136310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_arlineka__Brunhilde-13b-v3 | ---
pretty_name: Evaluation run of arlineka/Brunhilde-13b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [arlineka/Brunhilde-13b-v3](https://huggingface.co/arlineka/Brunhilde-13b-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arlineka__Brunhilde-13b-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T22:36:06.125500](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b-v3/blob/main/results_2024-04-02T22-36-06.125500.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5466857150017056,\n\
\ \"acc_stderr\": 0.03358927853713265,\n \"acc_norm\": 0.5560972322682318,\n\
\ \"acc_norm_stderr\": 0.034442806678616246,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5298508488280006,\n\
\ \"mc2_stderr\": 0.015694066944062498\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n\
\ \"acc_norm\": 0.6015358361774744,\n \"acc_norm_stderr\": 0.014306946052735567\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6501692889862577,\n\
\ \"acc_stderr\": 0.004759416464201141,\n \"acc_norm\": 0.8401712806213901,\n\
\ \"acc_norm_stderr\": 0.0036569821653861826\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101813,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101813\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.032424979581788166,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.032424979581788166\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.02534800603153477,\n \
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.02534800603153477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.728440366972477,\n \"acc_stderr\": 0.01906909836319143,\n \"acc_norm\"\
: 0.728440366972477,\n \"acc_norm_stderr\": 0.01906909836319143\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n\
\ \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n\
\ \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n \"acc_norm\"\
: 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033544,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033544\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719683,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719683\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.026074314851657083,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.026074314851657083\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790203,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790203\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.02007942040808792,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.02007942040808792\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.03086214492108756,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.03086214492108756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5298508488280006,\n\
\ \"mc2_stderr\": 0.015694066944062498\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865706\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \
\ \"acc_stderr\": 0.003282055917136976\n }\n}\n```"
repo_url: https://huggingface.co/arlineka/Brunhilde-13b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|arc:challenge|25_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|gsm8k|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hellaswag|10_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-36-06.125500.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T22-36-06.125500.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- '**/details_harness|winogrande|5_2024-04-02T22-36-06.125500.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T22-36-06.125500.parquet'
- config_name: results
data_files:
- split: 2024_04_02T22_36_06.125500
path:
- results_2024-04-02T22-36-06.125500.parquet
- split: latest
path:
- results_2024-04-02T22-36-06.125500.parquet
---
# Dataset Card for Evaluation run of arlineka/Brunhilde-13b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arlineka/Brunhilde-13b-v3](https://huggingface.co/arlineka/Brunhilde-13b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arlineka__Brunhilde-13b-v3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-02T22:36:06.125500](https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b-v3/blob/main/results_2024-04-02T22-36-06.125500.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5466857150017056,
"acc_stderr": 0.03358927853713265,
"acc_norm": 0.5560972322682318,
"acc_norm_stderr": 0.034442806678616246,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5298508488280006,
"mc2_stderr": 0.015694066944062498
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650649,
"acc_norm": 0.6015358361774744,
"acc_norm_stderr": 0.014306946052735567
},
"harness|hellaswag|10": {
"acc": 0.6501692889862577,
"acc_stderr": 0.004759416464201141,
"acc_norm": 0.8401712806213901,
"acc_norm_stderr": 0.0036569821653861826
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.040260970832965634,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.040260970832965634
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.03019761160019795,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.03019761160019795
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101813,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101813
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.032424979581788166,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.032424979581788166
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.02534800603153477,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.02534800603153477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.728440366972477,
"acc_stderr": 0.01906909836319143,
"acc_norm": 0.728440366972477,
"acc_norm_stderr": 0.01906909836319143
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033544,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033544
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719683,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719683
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.026074314851657083,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.026074314851657083
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790203,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790203
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.02007942040808792,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.02007942040808792
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.03086214492108756,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.03086214492108756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5298508488280006,
"mc2_stderr": 0.015694066944062498
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865706
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.003282055917136976
}
}
```
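As an illustrative sketch (not part of the evaluation harness itself — the excerpt dict and the `mean_metric` helper below are hypothetical), per-task metrics in a dict of this shape can be aggregated by key prefix:

```python
# Illustrative excerpt of a results dict shaped like the one above
# (a few tasks only, not the full run).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32, "acc_norm": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889, "acc_norm": 0.4888888888888889},
    "harness|winogrande|5": {"acc": 0.7426992896606156},
}

def mean_metric(results, metric="acc", prefix="harness|hendrycksTest-"):
    """Average `metric` over all tasks whose key starts with `prefix`."""
    values = [v[metric] for k, v in results.items() if k.startswith(prefix)]
    return sum(values) / len(values) if values else None

mmlu_acc = mean_metric(results)  # mean over the two hendrycksTest entries, ~0.404
```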
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
l-lt/LaSOT | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
viewer: false
paperswithcode_id: lasot
---
# Dataset Card for LaSOT
## Dataset Description
- **Homepage:** [LaSOT homepage](http://vision.cs.stonybrook.edu/~lasot/)
- **Paper:** [LaSOT: A High-quality Benchmark for Large-scale Single Object Tracking](https://arxiv.org/abs/1809.07845)
- **Point of Contact:** [Heng Fan](heng.fan@unt.edu)
### Dataset Summary
**La**rge-scale **S**ingle **O**bject **T**racking (**LaSOT**) aims to provide a dedicated platform for training data-hungry deep trackers as well as assessing long-term tracking performance.
This repository contains the conference version of LaSOT, published in CVPR-19 ([LaSOT: A High-quality Benchmark for Large-scale Single Object Tracking](https://arxiv.org/abs/1809.07845)).
**LaSOT** features:
- **Large-scale**: 1,400 sequences with more than 3.5 million frames
- **High-quality**: Manual annotation with careful inspection of each frame
- **Category balance**: 70 categories, each containing 20 sequences
- **Long-term tracking**: An average video length of around 2,500 frames (i.e., 83 seconds)
- **Comprehensive labeling**: Both visual and lingual annotations for each sequence
For the new subset (15 categories with 150 videos) in the [extended journal version](https://arxiv.org/abs/2009.03465) (commonly referred to as LaSOT<sub>ext</sub>), visit this [repo](https://huggingface.co/datasets/l-lt/LaSOT-ext).
## Download
You can download the whole dataset via the ```huggingface_hub``` library ([guide](https://huggingface.co/docs/huggingface_hub/guides/download)):
```python
from huggingface_hub import snapshot_download
snapshot_download(repo_id='l-lt/LaSOT', repo_type='dataset', local_dir='/path/to/download')
```
Alternatively, download the videos of a specific category manually from this [page](https://huggingface.co/datasets/l-lt/LaSOT/tree/main).
LaSOT is also distributed through several cloud storage services:
* As a single zip file: [OneDrive](https://1drv.ms/u/s!Akt_zO4y_u6DgoQsxl9ixr5Y393qWA?e=7yTwjc)
* As one zip file per category: [OneDrive](https://1drv.ms/f/s!Akt_zO4y_u6DgoNSoMJrfnVwveDjhA?e=PBeyuD) or [Baidu Pan](https://pan.baidu.com/s/1xFANiqkBHytE7stMOLUpLQ)
### Setup
Unzip all the zip files, and the paths should be organized as follows:
```
├── airplane
│ ├── airplane-1
│ ...
├── basketball
...
├── training_set.txt
└── testing_set.txt
```
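Assuming the split files (`training_set.txt` / `testing_set.txt`) list sequence names following the `<category>-<index>` pattern shown above (e.g. `airplane-1`), a sequence's directory can be resolved from its name alone. The helper below is a hypothetical sketch, not part of the official toolkit:

```python
def sequence_path(root, seq_name):
    """Map a sequence name like 'airplane-1' to '<root>/<category>/<seq_name>'."""
    category = seq_name.rsplit("-", 1)[0]  # 'airplane-1' -> 'airplane'
    return f"{root}/{category}/{seq_name}"

print(sequence_path("/path/to/download", "airplane-1"))
# /path/to/download/airplane/airplane-1
```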
## Evaluation Metrics and Toolkit
See the [homepage](http://vision.cs.stonybrook.edu/~lasot/results.html) for more information. |
ParallelnoMinded/Nest | ---
license: apache-2.0
---
|
Mohamad-Jaallouk/ConstScene | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 495130583.56
num_examples: 2770
- name: validation
num_bytes: 61731350.0
num_examples: 352
- name: test
num_bytes: 61079974.0
num_examples: 348
download_size: 610073366
dataset_size: 617941907.56
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Eric111__MarcoHermes | ---
pretty_name: Evaluation run of Eric111/MarcoHermes
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eric111/MarcoHermes](https://huggingface.co/Eric111/MarcoHermes) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__MarcoHermes\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T02:58:56.333931](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__MarcoHermes/blob/main/results_2024-02-10T02-58-56.333931.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651179536747465,\n\
\ \"acc_stderr\": 0.032176814246608,\n \"acc_norm\": 0.6518376397883119,\n\
\ \"acc_norm_stderr\": 0.032837688243384115,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5845514058616202,\n\
\ \"mc2_stderr\": 0.015149100918970279\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6262798634812287,\n \"acc_stderr\": 0.014137708601759086,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283514\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6570404301931886,\n\
\ \"acc_stderr\": 0.004737279691036193,\n \"acc_norm\": 0.855008962358096,\n\
\ \"acc_norm_stderr\": 0.0035137222519546867\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313043,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313043\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n\
\ \"acc_stderr\": 0.015972668523689074,\n \"acc_norm\": 0.35195530726256985,\n\
\ \"acc_norm_stderr\": 0.015972668523689074\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015055,\n\
\ \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015055\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5845514058616202,\n\
\ \"mc2_stderr\": 0.015149100918970279\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491906\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \
\ \"acc_stderr\": 0.01274030571737627\n }\n}\n```"
repo_url: https://huggingface.co/Eric111/MarcoHermes
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-58-56.333931.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T02-58-56.333931.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- '**/details_harness|winogrande|5_2024-02-10T02-58-56.333931.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T02-58-56.333931.parquet'
- config_name: results
data_files:
- split: 2024_02_10T02_58_56.333931
path:
- results_2024-02-10T02-58-56.333931.parquet
- split: latest
path:
- results_2024-02-10T02-58-56.333931.parquet
---
# Dataset Card for Evaluation run of Eric111/MarcoHermes
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eric111/MarcoHermes](https://huggingface.co/Eric111/MarcoHermes) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__MarcoHermes",
"harness_winogrande_5",
	split="latest")
```
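The timestamp-based split naming described above appears to map a run timestamp onto a split name by replacing the `-` and `:` separators with `_`. A minimal sketch of that convention (`run_timestamp_to_split` is a hypothetical helper, not part of any library):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results filenames)
    to the corresponding split name used in this dataset's configs."""
    # Split names replace the '-' and ':' separators with '_';
    # the fractional-seconds part is kept as-is.
    return timestamp.replace("-", "_").replace(":", "_")

# e.g. the run shown on this card:
print(run_timestamp_to_split("2024-02-10T02:58:56.333931"))
# → 2024_02_10T02_58_56.333931
```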
## Latest results
These are the [latest results from run 2024-02-10T02:58:56.333931](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__MarcoHermes/blob/main/results_2024-02-10T02-58-56.333931.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.651179536747465,
"acc_stderr": 0.032176814246608,
"acc_norm": 0.6518376397883119,
"acc_norm_stderr": 0.032837688243384115,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5845514058616202,
"mc2_stderr": 0.015149100918970279
},
"harness|arc:challenge|25": {
"acc": 0.6262798634812287,
"acc_stderr": 0.014137708601759086,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283514
},
"harness|hellaswag|10": {
"acc": 0.6570404301931886,
"acc_stderr": 0.004737279691036193,
"acc_norm": 0.855008962358096,
"acc_norm_stderr": 0.0035137222519546867
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313043,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313043
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.015972668523689074,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.015972668523689074
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5845514058616202,
"mc2_stderr": 0.015149100918970279
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491906
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.01274030571737627
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: Amazon Review Polarity
dataset_info:
config_name: amazon_polarity
features:
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1604364432
num_examples: 3600000
- name: test
num_bytes: 178176193
num_examples: 400000
download_size: 1145430497
dataset_size: 1782540625
configs:
- config_name: amazon_polarity
data_files:
- split: train
path: amazon_polarity/train-*
- split: test
path: amazon_polarity/test-*
default: true
train-eval-index:
- config: amazon_polarity
task: text-classification
task_id: binary_classification
splits:
train_split: train
eval_split: test
col_mapping:
content: text
label: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for Amazon Review Polarity
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://registry.opendata.aws/
- **Repository:** https://github.com/zhangxiangxiao/Crepe
- **Paper:** https://arxiv.org/abs/1509.01626
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Xiang Zhang](mailto:xiang.zhang@nyu.edu)
### Dataset Summary
The Amazon reviews dataset consists of product reviews from Amazon.
The data span a period of 18 years, including ~35 million reviews up to March 2013.
Reviews include product and user information, ratings, and a plaintext review.
### Supported Tasks and Leaderboards
- `text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the title and content of a review, predict its sentiment polarity (positive or negative).
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A typical data point comprises a title, a content and the corresponding label.
An example from the AmazonPolarity test set looks as follows:
```
{
'title':'Great CD',
'content':"My lovely Pat has one of the GREAT voices of her generation. I have listened to this CD for YEARS and I still LOVE IT. When I'm in a good mood it makes me feel better. A bad mood just evaporates like sugar in the rain. This CD just oozes LIFE. Vocals are jusat STUUNNING and lyrics just kill. One of life's hidden gems. This is a desert isle CD in my book. Why she never made it big is just beyond me. Everytime I play this, no matter black, white, young, old, male, female EVERYBODY says one thing ""Who was that singing ?""",
'label':1
}
```
### Data Fields
- 'title': a string containing the title of the review. Fields are wrapped in double quotes ("), any internal double quote is escaped by doubling it (""), and newlines are escaped as a backslash followed by an "n" character, i.e. "\n".
- 'content': a string containing the body of the review, escaped in the same way as 'title'.
- 'label': either 1 (positive) or 0 (negative).
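The escaping convention described above can be undone with a short helper. This is only a sketch based on the field description here, not an official utility, and the name `unescape_field` is ours:

```python
def unescape_field(raw):
    """Undo the escaping described above for 'title' and 'content':
    strip the wrapping double quotes, turn doubled quotes ("") back into
    a single quote, and turn the literal two-character sequence \\n into
    a real newline."""
    if raw.startswith('"') and raw.endswith('"'):
        raw = raw[1:-1]  # drop the wrapping quotes
    return raw.replace('""', '"').replace('\\n', '\n')

s = unescape_field('"He said ""great CD""\\nFive stars"')
# s == 'He said "great CD"\nFive stars'
```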
### Data Splits
The Amazon reviews polarity dataset is constructed by taking reviews with scores 1 and 2 as negative, and reviews with scores 4 and 5 as positive. Samples with score 3 are ignored. Each class has 1,800,000 training samples and 200,000 testing samples.
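The construction rule above can be sketched as a small function (a hypothetical helper for illustration, not shipped with the dataset):

```python
def polarity_label(score):
    """Map a 1-5 star review score to the binary polarity label described
    above: 1-2 -> 0 (negative), 4-5 -> 1 (positive), 3 -> dropped (None)."""
    if score in (1, 2):
        return 0
    if score in (4, 5):
        return 1
    return None  # score 3 is ignored when building the dataset

labels = [polarity_label(s) for s in (1, 2, 3, 4, 5)]
# labels == [0, 0, None, 1, 1]
```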
## Dataset Creation
### Curation Rationale
The Amazon reviews polarity dataset is constructed by Xiang Zhang (xiang.zhang@nyu.edu). It is used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Apache License 2.0
### Citation Information
McAuley, Julian, and Jure Leskovec. "Hidden factors and hidden topics: understanding rating dimensions with review text." In Proceedings of the 7th ACM conference on Recommender systems, pp. 165-172. 2013.
Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
### Contributions
Thanks to [@hfawaz](https://github.com/hfawaz) for adding this dataset. |
CyberHarem/chloe_von_einzbern_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of chloe_von_einzbern/クロエ・フォン・アインツベルン/克洛伊·冯·爱因兹贝伦 (Fate/Grand Order)
This is the dataset of chloe_von_einzbern/クロエ・フォン・アインツベルン/克洛伊·冯·爱因兹贝伦 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `dark_skin, long_hair, dark-skinned_female, pink_hair, breasts, orange_eyes, small_breasts, hair_between_eyes, hair_ornament, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 567.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 510.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1177 | 1003.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chloe_von_einzbern_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chloe_von_einzbern_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
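For the IMG+TXT packages (`dataset-1200.zip` and the stage3 variant), each image ships alongside a tag file. Once such an archive is extracted, the pairs can be collected with a helper like the one below; the pairing-by-filename-stem layout is assumed from the package type, and `pair_img_txt` is our own name, not part of waifuc:

```python
import os

def pair_img_txt(dataset_dir):
    """Collect (image_path, tags) pairs from an extracted IMG+TXT archive,
    matching each image with the .txt file that shares its filename stem."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs
```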
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, collarbone, simple_background, white_background, bare_shoulders, blush, upper_body, yellow_eyes, smile, closed_mouth |
| 1 | 11 |  |  |  |  |  | 1girl, dual_wielding, kanshou_&_bakuya_(fate), looking_at_viewer, solo, holding_sword, yellow_eyes, midriff, smile, waist_cape, black_footwear, knee_boots, long_sleeves, navel, bridal_gauntlets, full_body, high_heels, panties, shrug_(clothing), thigh_strap |
| 2 | 15 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, navel, smile, waist_cape, black_panties, bridal_gauntlets, stomach_tattoo, long_sleeves, red_cape, shrug_(clothing), yellow_eyes, midriff, white_background, breastplate, half_updo, knee_boots, simple_background, black_footwear, single_hair_bun |
| 3 | 6 |  |  |  |  |  | 1girl, black_jacket, blush, collarbone, looking_at_viewer, open_jacket, short_shorts, smile, solo, twintails, bare_shoulders, black_shorts, off_shoulder, simple_background, closed_mouth, midriff, tank_top, white_background, black_camisole, black_footwear, boots, full_body, hair_bobbles, long_sleeves, navel, socks |
| 4 | 5 |  |  |  |  |  | 1girl, blush, hairpin, half_updo, looking_at_viewer, single_hair_bun, solo, ass, bare_shoulders, from_behind, looking_back, simple_background, thighs, smile, white_background, bike_shorts, black_shorts, black_sports_bra, butt_crack, closed_mouth, sitting, sweat, tongue_out |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, open_mouth, half_updo, hetero, navel, nipples, simple_background, sweat, white_background, collarbone, completely_nude, hairpin, loli, solo_focus, stomach_tattoo, sex_from_behind, single_hair_bun, tongue_out, ass, censored, cum, looking_at_viewer, petite |
| 6 | 5 |  |  |  |  |  | 1boy, blush, hetero, navel, nipples, sex, spread_legs, stomach_tattoo, vaginal, 1girl, completely_nude, looking_at_viewer, one_side_up, penis, solo_focus, thighs, collarbone, cum_in_pussy, hair_bobbles, loli, open_mouth, aged_up, alternate_breast_size, bar_censor, girl_on_top, grin, heart-shaped_pupils, huge_breasts, leg_lift, lying, pubic_tattoo, straddling, testicles |
| 7 | 6 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, navel, simple_background, smile, solo, stomach_tattoo, bare_shoulders, white_background, ass_visible_through_thighs, black_gloves, black_thighhighs, elbow_gloves, armpits, hairpin, micro_bikini, open_mouth, sweat |
| 8 | 8 |  |  |  |  |  | 1girl, blue_jacket, long_sleeves, looking_at_viewer, navel, solo, stomach_tattoo, belt, blush, open_jacket, smile, blue_skirt, choker, earrings, sunglasses, collarbone, eyewear_on_head, id_card, nail_polish, black_nails, simple_background, white_background, bikini, boots, cropped_jacket, hair_flower, midriff, open_mouth, thighs, umbrella |
| 9 | 12 |  |  |  |  |  | blush, looking_at_viewer, 1girl, animal_ear_fluff, bare_shoulders, smile, hair_bow, jingle_bell, paw_gloves, red_bow, black_skirt, collar, fur-trimmed_skirt, open_mouth, cat_ears, navel, ponytail, solo, cat_tail, hair_bell, o-ring, thigh_strap, thighs, stomach_tattoo, belt, black_thighhighs, fang, paw_shoes, simple_background |
| 10 | 6 |  |  |  |  |  | 1girl, black_leotard, blush, cosplay, covered_navel, elbow_gloves, solo, taimanin_suit, bare_shoulders, censored, covered_nipples, hairpin, highleg_leotard, single_hair_bun, spread_legs, thighhighs, black_gloves, half_updo, open_mouth, sweat, thigh_boots, black_footwear, fishnets, stomach_tattoo, tentacle_sex |
| 11 | 24 |  |  |  |  |  | looking_at_viewer, 1girl, one_side_up, homurahara_academy_school_uniform, solo, white_shirt, blush, neck_ribbon, puffy_short_sleeves, smile, black_skirt, pleated_skirt, white_sailor_collar, simple_background, white_background, closed_mouth, open_mouth, red_ribbon, collared_shirt |
| 12 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, chalkboard, classroom, indoors, school_desk, white_shirt, black_panties, crop_top, navel, sailor_collar, long_sleeves, neckerchief, serafuku, black_skirt, closed_mouth, looking_back, pantyhose, pleated_skirt, ponytail, thick_thighs |
| 13 | 8 |  |  |  |  |  | bare_shoulders, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, 1girl, blush, smile, highleg_leotard, red_leotard, solo, thighs, wrist_cuffs, ass, detached_collar, looking_back, simple_background, strapless_leotard, 2girls, bowtie, covered_navel, fishnet_pantyhose, rabbit_tail, thighhighs, tongue_out, white_background, yellow_eyes |
| 14 | 8 |  |  |  |  |  | blush, obi, 1girl, floral_print, smile, solo, long_sleeves, looking_at_viewer, red_kimono, wide_sleeves, open_mouth, hair_flower, new_year |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | collarbone | simple_background | white_background | bare_shoulders | blush | upper_body | yellow_eyes | smile | closed_mouth | dual_wielding | kanshou_&_bakuya_(fate) | holding_sword | midriff | waist_cape | black_footwear | knee_boots | long_sleeves | navel | bridal_gauntlets | full_body | high_heels | panties | shrug_(clothing) | thigh_strap | black_panties | stomach_tattoo | red_cape | breastplate | half_updo | single_hair_bun | black_jacket | open_jacket | short_shorts | twintails | black_shorts | off_shoulder | tank_top | black_camisole | boots | hair_bobbles | socks | hairpin | ass | from_behind | looking_back | thighs | bike_shorts | black_sports_bra | butt_crack | sitting | sweat | tongue_out | 1boy | open_mouth | hetero | nipples | completely_nude | loli | solo_focus | sex_from_behind | censored | cum | petite | sex | spread_legs | vaginal | one_side_up | penis | cum_in_pussy | aged_up | alternate_breast_size | bar_censor | girl_on_top | grin | heart-shaped_pupils | huge_breasts | leg_lift | lying | pubic_tattoo | straddling | testicles | ass_visible_through_thighs | black_gloves | black_thighhighs | elbow_gloves | armpits | micro_bikini | blue_jacket | belt | blue_skirt | choker | earrings | sunglasses | eyewear_on_head | id_card | nail_polish | black_nails | bikini | cropped_jacket | hair_flower | umbrella | animal_ear_fluff | hair_bow | jingle_bell | paw_gloves | red_bow | black_skirt | collar | fur-trimmed_skirt | cat_ears | ponytail | cat_tail | hair_bell | o-ring | fang | paw_shoes | black_leotard | cosplay | covered_navel | taimanin_suit | covered_nipples | highleg_leotard | thighhighs | thigh_boots | fishnets | tentacle_sex | homurahara_academy_school_uniform | white_shirt | neck_ribbon | puffy_short_sleeves | pleated_skirt | white_sailor_collar | red_ribbon | collared_shirt | chalkboard | classroom | indoors | school_desk | crop_top | sailor_collar | 
neckerchief | serafuku | pantyhose | thick_thighs | fake_animal_ears | playboy_bunny | rabbit_ears | red_leotard | wrist_cuffs | detached_collar | strapless_leotard | 2girls | bowtie | fishnet_pantyhose | rabbit_tail | obi | floral_print | red_kimono | wide_sleeves | new_year |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------------------|:-------------|:--------------------|:-------------------|:-----------------|:--------|:-------------|:--------------|:--------|:---------------|:----------------|:--------------------------|:----------------|:----------|:-------------|:-----------------|:-------------|:---------------|:--------|:-------------------|:------------|:-------------|:----------|:-------------------|:--------------|:----------------|:-----------------|:-----------|:--------------|:------------|:------------------|:---------------|:--------------|:---------------|:------------|:---------------|:---------------|:-----------|:-----------------|:--------|:---------------|:--------|:----------|:------|:--------------|:---------------|:---------|:--------------|:-------------------|:-------------|:----------|:--------|:-------------|:-------|:-------------|:---------|:----------|:------------------|:-------|:-------------|:------------------|:-----------|:------|:---------|:------|:--------------|:----------|:--------------|:--------|:---------------|:----------|:------------------------|:-------------|:--------------|:-------|:----------------------|:---------------|:-----------|:--------|:---------------|:-------------|:------------|:-----------------------------|:---------------|:-------------------|:---------------|:----------|:---------------|:--------------|:-------|:-------------|:---------|:-----------|:-------------|:------------------|:----------|:--------------|:--------------|:---------|:-----------------|:--------------|:-----------|:-------------------|:-----------|:--------------|:-------------|:----------|:--------------|:---------|:--------------------|:-----------|:-----------|:-----------|:------------|:---------|:-------|:------------|:---
-------------|:----------|:----------------|:----------------|:------------------|:------------------|:-------------|:--------------|:-----------|:---------------|:------------------------------------|:--------------|:--------------|:----------------------|:----------------|:----------------------|:-------------|:-----------------|:-------------|:------------|:----------|:--------------|:-----------|:----------------|:--------------|:-----------|:------------|:---------------|:-------------------|:----------------|:--------------|:--------------|:--------------|:------------------|:--------------------|:---------|:---------|:--------------------|:--------------|:------|:---------------|:-------------|:---------------|:-----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | X | | X | X | | X | | X | X | | | | | X | X | X | X | X | X | X | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | X | | | | X | | X | | X | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | X | | | X | X | | | | | | | | | | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | X | | | | X | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | | | | | X | | | | X | X | | | | | | | | X | | | | | | X | | | | | | | X | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | X | X | | X | | X | X | | | X | | | | | | | | | | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | X | | | | | X | X | | | | | | | | | | X | | | | | | | | | | | X | | | X | X | | | | | | | | | | | | X | | | | | | | | | X | | | X | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 24 |  |  |  |  |  | X | X | X | | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 8 |  |  |  |  |  | X | X | X | | | | | X | | | X | X | | | | | | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 13 | 8 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 14 | 8 |  |  |  |  |  | X | X | X | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
|
jamesagilesoda/ko-gen-sft-1444372 | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8112211386
num_examples: 1444372
download_size: 3523230329
dataset_size: 8112211386
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sordonia/flan-debug-flat | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: task_name
dtype: string
- name: task_source
dtype: string
- name: template_type
dtype: string
- name: template_idx
dtype: int64
- name: split
dtype: string
splits:
- name: train
num_bytes: 28551282
num_examples: 18200
download_size: 12635228
dataset_size: 28551282
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "flan-debug-flat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mehdie/fine_tune_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 85896
num_examples: 2594
download_size: 28599
dataset_size: 85896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1 | ---
pretty_name: Evaluation run of Unbabel/TowerInstruct-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Unbabel/TowerInstruct-7B-v0.1](https://huggingface.co/Unbabel/TowerInstruct-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T18:58:50.073000](https://huggingface.co/datasets/open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1/blob/main/results_2024-01-13T18-58-50.073000.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4711217152311766,\n\
\ \"acc_stderr\": 0.03442367854889606,\n \"acc_norm\": 0.4757369265971281,\n\
\ \"acc_norm_stderr\": 0.03519302105112233,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.42594704830683766,\n\
\ \"mc2_stderr\": 0.014921954316600566\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.014607794914013048,\n\
\ \"acc_norm\": 0.5546075085324232,\n \"acc_norm_stderr\": 0.014523987638344076\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5993825931089425,\n\
\ \"acc_stderr\": 0.004890221012015062,\n \"acc_norm\": 0.789982075283808,\n\
\ \"acc_norm_stderr\": 0.004064885496003441\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4276315789473684,\n \"acc_stderr\": 0.04026097083296558,\n\
\ \"acc_norm\": 0.4276315789473684,\n \"acc_norm_stderr\": 0.04026097083296558\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4716981132075472,\n \"acc_stderr\": 0.0307235352490061,\n\
\ \"acc_norm\": 0.4716981132075472,\n \"acc_norm_stderr\": 0.0307235352490061\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5193548387096775,\n \"acc_stderr\": 0.02842268740431211,\n \"\
acc_norm\": 0.5193548387096775,\n \"acc_norm_stderr\": 0.02842268740431211\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.30049261083743845,\n \"acc_stderr\": 0.032257994762334846,\n \"\
acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.032257994762334846\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.02517404838400076,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.02517404838400076\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415325,\n \"\
acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415325\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380763,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380763\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6582278481012658,\n \"acc_stderr\": 0.030874537537553617,\n \
\ \"acc_norm\": 0.6582278481012658,\n \"acc_norm_stderr\": 0.030874537537553617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.049318019942204146,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.049318019942204146\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.030351527323344927,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.030351527323344927\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.648786717752235,\n\
\ \"acc_stderr\": 0.017069982051499434,\n \"acc_norm\": 0.648786717752235,\n\
\ \"acc_norm_stderr\": 0.017069982051499434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756646,\n\
\ \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756646\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.0285803410651383,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.0285803410651383\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759563,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35853976531942633,\n\
\ \"acc_stderr\": 0.012248487319682741,\n \"acc_norm\": 0.35853976531942633,\n\
\ \"acc_norm_stderr\": 0.012248487319682741\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43137254901960786,\n \"acc_stderr\": 0.020036393768352635,\n \
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.020036393768352635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.03171752824062663,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.03171752824062663\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6517412935323383,\n\
\ \"acc_stderr\": 0.033687874661154596,\n \"acc_norm\": 0.6517412935323383,\n\
\ \"acc_norm_stderr\": 0.033687874661154596\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.42594704830683766,\n\
\ \"mc2_stderr\": 0.014921954316600566\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998295\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1645185746777862,\n \
\ \"acc_stderr\": 0.010212173002763541\n }\n}\n```"
repo_url: https://huggingface.co/Unbabel/TowerInstruct-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-58-50.073000.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- '**/details_harness|winogrande|5_2024-01-13T18-58-50.073000.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T18-58-50.073000.parquet'
- config_name: results
data_files:
- split: 2024_01_13T18_58_50.073000
path:
- results_2024-01-13T18-58-50.073000.parquet
- split: latest
path:
- results_2024-01-13T18-58-50.073000.parquet
---
# Dataset Card for Evaluation run of Unbabel/TowerInstruct-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Unbabel/TowerInstruct-7B-v0.1](https://huggingface.co/Unbabel/TowerInstruct-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1",
"harness_winogrande_5",
split="train")
```
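The config names listed in this card follow a simple pattern: the harness task name with its separators replaced by underscores, plus the few-shot count. The helper below is only an illustration of that pattern (it is not part of the `datasets` API):

```python
def details_config_name(task: str, num_fewshot: int) -> str:
    """Build the config name for a harness task, e.g.
    'hendrycksTest-virology' with 5 shots -> 'harness_hendrycksTest_virology_5'."""
    # '-' and ':' in task names (e.g. 'truthfulqa:mc') become underscores.
    sanitized = task.replace("-", "_").replace(":", "_")
    return f"harness_{sanitized}_{num_fewshot}"

print(details_config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5
print(details_config_name("truthfulqa:mc", 0))           # harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset` in the snippet above.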
## Latest results
These are the [latest results from run 2024-01-13T18:58:50.073000](https://huggingface.co/datasets/open-llm-leaderboard/details_Unbabel__TowerInstruct-7B-v0.1/blob/main/results_2024-01-13T18-58-50.073000.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4711217152311766,
"acc_stderr": 0.03442367854889606,
"acc_norm": 0.4757369265971281,
"acc_norm_stderr": 0.03519302105112233,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.42594704830683766,
"mc2_stderr": 0.014921954316600566
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.014607794914013048,
"acc_norm": 0.5546075085324232,
"acc_norm_stderr": 0.014523987638344076
},
"harness|hellaswag|10": {
"acc": 0.5993825931089425,
"acc_stderr": 0.004890221012015062,
"acc_norm": 0.789982075283808,
"acc_norm_stderr": 0.004064885496003441
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4276315789473684,
"acc_stderr": 0.04026097083296558,
"acc_norm": 0.4276315789473684,
"acc_norm_stderr": 0.04026097083296558
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4716981132075472,
"acc_stderr": 0.0307235352490061,
"acc_norm": 0.4716981132075472,
"acc_norm_stderr": 0.0307235352490061
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.032257994762334846,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.032257994762334846
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.02517404838400076,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.02517404838400076
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.020048115923415325,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.020048115923415325
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380763,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6582278481012658,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.6582278481012658,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.049318019942204146,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.049318019942204146
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344927,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344927
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.648786717752235,
"acc_stderr": 0.017069982051499434,
"acc_norm": 0.648786717752235,
"acc_norm_stderr": 0.017069982051499434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.0285803410651383,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.0285803410651383
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759563,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35853976531942633,
"acc_stderr": 0.012248487319682741,
"acc_norm": 0.35853976531942633,
"acc_norm_stderr": 0.012248487319682741
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.020036393768352635,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.020036393768352635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.03171752824062663,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.03171752824062663
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6517412935323383,
"acc_stderr": 0.033687874661154596,
"acc_norm": 0.6517412935323383,
"acc_norm_stderr": 0.033687874661154596
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.42594704830683766,
"mc2_stderr": 0.014921954316600566
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998295
},
"harness|gsm8k|5": {
"acc": 0.1645185746777862,
"acc_stderr": 0.010212173002763541
}
}
```
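As a quick sanity check, the per-task scores above can be aggregated by hand. The sketch below uses a small hand-copied excerpt of the JSON (not the full file) to compute an unweighted macro-average over a few MMLU subtasks:

```python
# Excerpt of the per-task results shown above; the full JSON holds one
# entry per harness task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45185185185185184},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4276315789473684},
}

# Unweighted macro-average of accuracy over the selected subtasks.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
avg_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"average acc over {len(mmlu_tasks)} tasks: {avg_acc:.4f}")
# -> average acc over 3 tasks: 0.3932
```

The "all" block in the JSON is the same kind of aggregate computed over every task in the run.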
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
- split: test
path: "test.csv"
---
# Dataset Card for "Idiomology - Idiom Detection Dataset"
## Dataset Description
### General Information
This dataset was created for training and evaluating language models, with a particular focus on their ability to identify idiomatic expressions within sentences. It aims to improve how natural language understanding systems recognize idioms in varied contexts.
## Usage Example
```python
from datasets import load_dataset
# Load the dataset
dataset = load_dataset("SnehitVaddi/Idiomology_Lama2_7B_Chat")
```
### Dataset Structure
Entries in this dataset consist of sentences incorporating idioms, paired with annotations identifying the idiomatic expressions used. The dataset is organized into training and testing sets, adhering to an 80/20 split.
## How to Use the Dataset
### For Model Training
Utilize the training set to fine-tune language models on idiom detection tasks. Models should learn to accurately predict the idiom present in a given sentence context.
### For Model Evaluation
Employ the test set to assess the model's idiom identification capabilities. Model performance can be evaluated using standard metrics such as accuracy or F1 score.
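A minimal sketch of such an evaluation is shown below. The exact-match criterion and the simple normalization are illustrative choices, not a prescribed protocol, and the example idiom strings are hypothetical:

```python
def idiom_accuracy(predictions, references):
    """Exact-match accuracy: the fraction of predicted idioms that match
    the annotated idiom after lowercasing and whitespace trimming."""
    normalize = lambda s: s.strip().lower()
    correct = sum(
        normalize(p) == normalize(r) for p, r in zip(predictions, references)
    )
    return correct / len(references)

preds = ["break the ice", "spill the beans", "hit the sack"]
refs = ["break the ice", "spill the beans", "hit the hay"]
print(idiom_accuracy(preds, refs))  # 2 of 3 match
```

For partial-credit scoring (e.g. token-level F1), the same loop can compare token overlap between the predicted and annotated idiom instead of exact strings.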
## Dataset Creation
### Curation Rationale
The need for nuanced understanding of context and figurative language in natural language processing motivated the curation of this dataset, focusing on the challenge of idiom detection.
### Source
This dataset was generated from a compiled list of idioms and their illustrative sentences, designed to reflect real-life applications of these expressions.
## Dataset Structure
### Data Fields
- `sentence_with_idiom`: A sentence incorporating an idiom.
- `idiom_annotation`: The annotation of the idiom present in the sentence, provided in various phrasings to reflect natural language variation.
## Data Splits
- **Training Set**: Constitutes 80% of the dataset, intended for model training.
- **Test Set**: Makes up 20% of the dataset, used for evaluating model efficacy.
## Dataset Challenges
### Idiom Variability
The figurative nature and context-dependent usage of idioms introduce challenges in consistently identifying them within diverse sentences.
### Annotation Diversity
The dataset's varied phrasing styles for idiom annotations demand that models generalize across different expressions of the same idea.
|
---
pretty_name: Evaluation run of migtissera/Tess-7B-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Tess-7B-v2.0](https://huggingface.co/migtissera/Tess-7B-v2.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-7B-v2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T17:44:19.166742](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-7B-v2.0/blob/main/results_2024-03-27T17-44-19.166742.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5195754749045444,\n\
\ \"acc_stderr\": 0.03430091207460778,\n \"acc_norm\": 0.5253111971650539,\n\
\ \"acc_norm_stderr\": 0.03502554358823958,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.44334331359677215,\n\
\ \"mc2_stderr\": 0.015404668572098906\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.014593487694937736,\n\
\ \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064663\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5763792073292173,\n\
\ \"acc_stderr\": 0.004931219148182242,\n \"acc_norm\": 0.7665803624775941,\n\
\ \"acc_norm_stderr\": 0.004221424792919217\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5660377358490566,\n \"acc_stderr\": 0.030503292013342592,\n\
\ \"acc_norm\": 0.5660377358490566,\n \"acc_norm_stderr\": 0.030503292013342592\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n\
\ \"acc_stderr\": 0.044895393502706986,\n \"acc_norm\": 0.3508771929824561,\n\
\ \"acc_norm_stderr\": 0.044895393502706986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028417,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953593,\n \"\
acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953593\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n \"\
acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.025294608023986472,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.025294608023986472\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n\
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6954128440366972,\n\
\ \"acc_stderr\": 0.01973229942035404,\n \"acc_norm\": 0.6954128440366972,\n\
\ \"acc_norm_stderr\": 0.01973229942035404\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n\
\ \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6919831223628692,\n \"acc_stderr\": 0.0300523893356057,\n \
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.0300523893356057\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262971,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262971\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199985,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199985\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\
\ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.04721188506097172,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.04721188506097172\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209814,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209814\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6130268199233716,\n\
\ \"acc_stderr\": 0.017417138059440146,\n \"acc_norm\": 0.6130268199233716,\n\
\ \"acc_norm_stderr\": 0.017417138059440146\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.026511261369409247,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.026511261369409247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468643,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468643\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39113428943937417,\n\
\ \"acc_stderr\": 0.012463861839982063,\n \"acc_norm\": 0.39113428943937417,\n\
\ \"acc_norm_stderr\": 0.012463861839982063\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.545751633986928,\n \"acc_stderr\": 0.020142974553795205,\n \
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.020142974553795205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.037712831076265434,\n\
\ \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.037712831076265434\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.44334331359677215,\n\
\ \"mc2_stderr\": 0.015404668572098906\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6764009471191792,\n \"acc_stderr\": 0.013148883320923153\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2486732373009856,\n \
\ \"acc_stderr\": 0.011906147222879967\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Tess-7B-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-44-19.166742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T17-44-19.166742.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- '**/details_harness|winogrande|5_2024-03-27T17-44-19.166742.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T17-44-19.166742.parquet'
- config_name: results
data_files:
- split: 2024_03_27T17_44_19.166742
path:
- results_2024-03-27T17-44-19.166742.parquet
- split: latest
path:
- results_2024-03-27T17-44-19.166742.parquet
---
# Dataset Card for Evaluation run of migtissera/Tess-7B-v2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/Tess-7B-v2.0](https://huggingface.co/migtissera/Tess-7B-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-7B-v2.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-27T17:44:19.166742](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-7B-v2.0/blob/main/results_2024-03-27T17-44-19.166742.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5195754749045444,
"acc_stderr": 0.03430091207460778,
"acc_norm": 0.5253111971650539,
"acc_norm_stderr": 0.03502554358823958,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.44334331359677215,
"mc2_stderr": 0.015404668572098906
},
"harness|arc:challenge|25": {
"acc": 0.5247440273037542,
"acc_stderr": 0.014593487694937736,
"acc_norm": 0.5588737201365188,
"acc_norm_stderr": 0.014509747749064663
},
"harness|hellaswag|10": {
"acc": 0.5763792073292173,
"acc_stderr": 0.004931219148182242,
"acc_norm": 0.7665803624775941,
"acc_norm_stderr": 0.004221424792919217
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5660377358490566,
"acc_stderr": 0.030503292013342592,
"acc_norm": 0.5660377358490566,
"acc_norm_stderr": 0.030503292013342592
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.044895393502706986,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.044895393502706986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028417,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6954128440366972,
"acc_stderr": 0.01973229942035404,
"acc_norm": 0.6954128440366972,
"acc_norm_stderr": 0.01973229942035404
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.0300523893356057,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.0300523893356057
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262971,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262971
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199985,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199985
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5276073619631901,
"acc_stderr": 0.0392237829061099,
"acc_norm": 0.5276073619631901,
"acc_norm_stderr": 0.0392237829061099
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.04721188506097172,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.04721188506097172
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209814,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209814
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6130268199233716,
"acc_stderr": 0.017417138059440146,
"acc_norm": 0.6130268199233716,
"acc_norm_stderr": 0.017417138059440146
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.026511261369409247,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.026511261369409247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468643,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468643
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5709876543209876,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.5709876543209876,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39113428943937417,
"acc_stderr": 0.012463861839982063,
"acc_norm": 0.39113428943937417,
"acc_norm_stderr": 0.012463861839982063
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.020142974553795205,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.020142974553795205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.037712831076265434,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.037712831076265434
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.44334331359677215,
"mc2_stderr": 0.015404668572098906
},
"harness|winogrande|5": {
"acc": 0.6764009471191792,
"acc_stderr": 0.013148883320923153
},
"harness|gsm8k|5": {
"acc": 0.2486732373009856,
"acc_stderr": 0.011906147222879967
}
}
```
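The aggregate numbers under `"all"` are, roughly, macro-averages of the per-task scores. A minimal sketch of that aggregation over the MMLU (`hendrycksTest`) tasks, using a hypothetical three-task subset of the results above rather than the full results file:

```python
# Illustrative sketch (not leaderboard code): macro-averaging per-task
# accuracies. The dict below is a hypothetical subset of the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4740740740740741},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5855263157894737},
}

# Collect the accuracies of every MMLU sub-task and average them.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: {mmlu_avg:.4f}")
```

On the full results file the same loop would run over all 57 MMLU sub-tasks.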
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tobaba2001/scs_phase2_ts_dataset6 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 7140888
num_examples: 10614
download_size: 162457
dataset_size: 7140888
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-college_chemistry-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 3641
num_examples: 9
download_size: 7082
dataset_size: 3641
---
# Dataset Card for "mmlu-college_chemistry-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
voidful/lfqa_eval | ---
dataset_info:
features:
- name: question
dtype: string
- name: input_info
dtype: string
- name: answer
dtype: string
splits:
- name: eil5
num_bytes: 265291.33893603133
num_examples: 300
- name: wikihownfqa
num_bytes: 323163.5233604801
num_examples: 300
- name: askh
num_bytes: 638261.7677181483
num_examples: 300
- name: aquamuse
num_bytes: 186636.99136868064
num_examples: 300
- name: asks
num_bytes: 393450.0
num_examples: 300
- name: stackxchange
num_bytes: 796344.5058889592
num_examples: 300
download_size: 1629681
dataset_size: 2603148.1272722995
---
# Dataset Card for "lfqa_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/19100_chat_05x_slot | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 5796
num_examples: 32
- name: validation
num_bytes: 5405
num_examples: 32
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 657930
---
# Dataset Card for "19100_chat_05x_slot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KenDoStudio/IRL_Artist_Miley_Cyrus | ---
license: mit
---
|
cvcio/toxic-el | ---
license: gpl-3.0
task_categories:
- text-classification
language:
- el
tags:
- text-classification
- toxicity
size_categories:
- 10K<n<100K
pretty_name: Toxic Tweets, Greek Dataset
multilinguality:
- monolingual
---
# Toxic Tweets, Greek Dataset
## Dataset Description
A frequent debate regarding research on media platforms revolves around the term toxicity. However, the term itself is poorly defined and often contradictory, encompassing everything from online harassment and bullying to negative commentary. In order to map online toxicity on Twitter, we formed 7 different categories: i) hateful, ii) insulting, iii) threatening, iv) racist, v) sexist, vi) using anti-refugee rhetoric, vii) using nationalistic language (table 4). These categories were clustered into wider groups. Every tweet that contained at least one of the above-mentioned categories was marked as toxic. Further on, tweets that included hateful language, threats and / or insults were marked as severe toxic. Tweets targeting an individual or a group based on identity characteristics such as gender, ethnic minority and / or religion were marked as identity hate.
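The grouping rules described above can be sketched as a small labeling function. This is a sketch only: the category names below are shorthand for the card's seven categories, not the dataset's actual field names.

```python
# Hypothetical category names standing in for the card's seven categories.
SEVERE = {"hateful", "insulting", "threatening"}
IDENTITY = {"racist", "sexist", "anti_refugee", "nationalistic"}
ALL_CATEGORIES = SEVERE | IDENTITY

def group_labels(categories):
    """Map a tweet's fine-grained categories to the wider groups:
    any category -> toxic; hate/threats/insults -> severe toxic;
    identity-targeting categories -> identity hate."""
    cats = set(categories)
    return {
        "toxic": bool(cats & ALL_CATEGORIES),
        "severe_toxic": bool(cats & SEVERE),
        "identity_hate": bool(cats & IDENTITY),
    }

print(group_labels(["insulting", "sexist"]))
```

An insulting and sexist tweet, for instance, would fall into all three wider groups.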
### Dataset Curators
Published by Dimitris Papaevagelou (Civic Information Office), Ioanna Archontaki (Civic Information Office), Stefanos Loukopoulos (VouliWatch), Maria Nathanail (VouliWatch) and Konstantinos Mentzelos (VouliWatch).
### Annotation Process
[VouliWatch](https://vouliwatch.gr/), a Greek non-profit and non-partisan parliamentary monitoring organisation, curated the annotation process with 15 annotators who marked 112,000 tweets, resulting in this dataset.
## Citation
```
@misc {civic_information_office_2023,
author = { {Civic Information Office} },
title = { toxic-el (Revision 65f60da) },
year = 2023,
url = { https://huggingface.co/datasets/cvcio/toxic-el },
doi = { 10.57967/hf/0744 },
publisher = { Hugging Face }
}
```
## Authors
Dimitris Papaevagelou (Civic Information Office) - [@andefined](https://huggingface.co/andefined)
## About
[Civic Information Office](https://cvcio.org/) is a Non Profit Organization based in Athens, Greece focusing on creating technology and research products for the public interest.
[VouliWatch](https://vouliwatch.gr/) is a Non Profit parliamentary monitoring and transparency watchdog organisation that promotes political integrity, engages Greek citizens with legislative politics and grants them with the opportunity to communicate, evaluate and hold elected representatives in the Greek and the European Parliament accountable.
|
Electrotubbie/classification_Turkic_languages | ---
task_categories:
- text-classification
language:
- ba
- kk
- ky
size_categories:
- 100K<n<1M
---
## Description
A dataset with texts and the categories to which these texts belong.
## Usage
This dataset can be used to evaluate language models on the correct classification of texts by category.
## Dataset structure:
- **lang**: the language to which the text source belongs;
- **title**: the title of the text;
- **original_text**: original text taken from a web page;
- **processed_text**: processed text using preprocessing functions;
- **category**: the category to which the text belongs;
- **processed**: flag indicating that one or more sentences have been deleted from the text;
- **url**: link to the source;
- **date**: date of publication of the text;
## The creation process
This dataset was obtained by parsing news resources from countries and regions where Turkic languages such as Bashkir, Kazakh and Kyrgyz are spoken.
During parsing, it was assumed a priori that articles were written in the language of the region the news covered.
After parsing, the text of the articles was processed through the preprocessing functions described on [github](https://github.com/Electrotubbie/turk_langs_analyse).
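As an illustrative, self-contained sketch of such a pipeline (an approximation only: a naive regex splitter stands in for razdel's `sentenize`, and a stub predictor stands in for the fasttext `lid.176.bin` model):

```python
import re

# ISO codes of the Turkic languages kept by the pipeline.
TURKIC = {"ba", "kk", "ky"}

def clean(text: str) -> str:
    # Strip leftover markup and collapse whitespace with regular expressions.
    return re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", text)).strip()

def sentenize(text: str):
    # Placeholder for razdel.sentenize: naive split on sentence-final punctuation.
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def predict_language(sentence: str) -> str:
    # Stub standing in for the fasttext lid.176.bin model (an assumption,
    # not the real predictor).
    return "ru" if "пример" in sentence else "kk"

def process(text: str) -> str:
    # Keep only sentences predicted to be in a Turkic language.
    kept = [s for s in sentenize(clean(text)) if predict_language(s) in TURKIC]
    return " ".join(kept)
```

With the real model in place of the stub, `process` would yield the `processed_text` column.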
The scheme of text preprocessing and validation is as follows:
- cleaning the text from unnecessary constructions using regular expressions;
- splitting text into sentences using the sentenize function of the razdel module;
- predicting the language of each sentence using the lid.176.bin model via the fasttext module;
- deleting sentences written in non-Turkic languages;
- combining the valid sentences back into text to obtain the processed_text column. |
adamtappis/marketing_emails | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 20404
num_examples: 10
download_size: 24797
dataset_size: 20404
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marketing_emails"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_vicgalleorg__OpenHermes-Yi-9B | ---
pretty_name: Evaluation run of vicgalleorg/OpenHermes-Yi-9B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vicgalleorg/OpenHermes-Yi-9B](https://huggingface.co/vicgalleorg/OpenHermes-Yi-9B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalleorg__OpenHermes-Yi-9B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T01:03:13.476946](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalleorg__OpenHermes-Yi-9B/blob/main/results_2024-03-07T01-03-13.476946.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6906750893244723,\n\
\ \"acc_stderr\": 0.030776345024466335,\n \"acc_norm\": 0.6966800698368959,\n\
\ \"acc_norm_stderr\": 0.03136866310374744,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842892,\n \"mc2\": 0.4225455721872872,\n\
\ \"mc2_stderr\": 0.014721446205323074\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693024\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5844453296156145,\n\
\ \"acc_stderr\": 0.004918102168717934,\n \"acc_norm\": 0.7872933678550089,\n\
\ \"acc_norm_stderr\": 0.004083855139469325\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.02749566368372405,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.02749566368372405\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.03295304696818317,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.03295304696818317\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n\
\ \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n\
\ \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7106382978723405,\n\
\ \"acc_stderr\": 0.02964400657700962,\n \"acc_norm\": 0.7106382978723405,\n\
\ \"acc_norm_stderr\": 0.02964400657700962\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6896551724137931,\n \"acc_stderr\": 0.03855289616378949,\n \"\
acc_norm\": 0.6896551724137931,\n \"acc_norm_stderr\": 0.03855289616378949\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5952380952380952,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.5952380952380952,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.5873015873015873,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8483870967741935,\n \"acc_stderr\": 0.02040261665441676,\n \"\
acc_norm\": 0.8483870967741935,\n \"acc_norm_stderr\": 0.02040261665441676\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5714285714285714,\n \"acc_stderr\": 0.034819048444388045,\n \"\
acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\"\
: 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8585858585858586,\n \"acc_stderr\": 0.02482590979334333,\n \"\
acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.02482590979334333\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.01871899852067817,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.01871899852067817\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7615384615384615,\n \"acc_stderr\": 0.021606294494647727,\n\
\ \"acc_norm\": 0.7615384615384615,\n \"acc_norm_stderr\": 0.021606294494647727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.025435119438105353,\n\
\ \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.025435119438105353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849927,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849927\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.01493386898702808,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.01493386898702808\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568617,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038332,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038332\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875192,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875192\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n\
\ \"acc_stderr\": 0.013140225515611729,\n \"acc_norm\": 0.8390804597701149,\n\
\ \"acc_norm_stderr\": 0.013140225515611729\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.016242028834053616,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.016242028834053616\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7845659163987139,\n\
\ \"acc_stderr\": 0.023350225475471442,\n \"acc_norm\": 0.7845659163987139,\n\
\ \"acc_norm_stderr\": 0.023350225475471442\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005723,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005723\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48370273794002605,\n\
\ \"acc_stderr\": 0.012763450734699814,\n \"acc_norm\": 0.48370273794002605,\n\
\ \"acc_norm_stderr\": 0.012763450734699814\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7058823529411765,\n \"acc_stderr\": 0.0184334276494019,\n \
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.0184334276494019\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900794,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900794\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842892,\n \"mc2\": 0.4225455721872872,\n\
\ \"mc2_stderr\": 0.014721446205323074\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663588\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4981046247156937,\n \
\ \"acc_stderr\": 0.013772385765569753\n }\n}\n```"
repo_url: https://huggingface.co/vicgalleorg/OpenHermes-Yi-9B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|arc:challenge|25_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|gsm8k|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hellaswag|10_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T01-03-13.476946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T01-03-13.476946.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- '**/details_harness|winogrande|5_2024-03-07T01-03-13.476946.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T01-03-13.476946.parquet'
- config_name: results
data_files:
- split: 2024_03_07T01_03_13.476946
path:
- results_2024-03-07T01-03-13.476946.parquet
- split: latest
path:
- results_2024-03-07T01-03-13.476946.parquet
---
# Dataset Card for Evaluation run of vicgalleorg/OpenHermes-Yi-9B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalleorg/OpenHermes-Yi-9B](https://huggingface.co/vicgalleorg/OpenHermes-Yi-9B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalleorg__OpenHermes-Yi-9B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-07T01:03:13.476946](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalleorg__OpenHermes-Yi-9B/blob/main/results_2024-03-07T01-03-13.476946.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6906750893244723,
"acc_stderr": 0.030776345024466335,
"acc_norm": 0.6966800698368959,
"acc_norm_stderr": 0.03136866310374744,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842892,
"mc2": 0.4225455721872872,
"mc2_stderr": 0.014721446205323074
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693024
},
"harness|hellaswag|10": {
"acc": 0.5844453296156145,
"acc_stderr": 0.004918102168717934,
"acc_norm": 0.7872933678550089,
"acc_norm_stderr": 0.004083855139469325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.02749566368372405,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.02749566368372405
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7106382978723405,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.7106382978723405,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5952380952380952,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.5952380952380952,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8483870967741935,
"acc_stderr": 0.02040261665441676,
"acc_norm": 0.8483870967741935,
"acc_norm_stderr": 0.02040261665441676
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8585858585858586,
"acc_stderr": 0.02482590979334333,
"acc_norm": 0.8585858585858586,
"acc_norm_stderr": 0.02482590979334333
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.01871899852067817,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.01871899852067817
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7615384615384615,
"acc_stderr": 0.021606294494647727,
"acc_norm": 0.7615384615384615,
"acc_norm_stderr": 0.021606294494647727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.025435119438105353,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.025435119438105353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849927,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849927
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.01493386898702808,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.01493386898702808
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568617,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038332,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038332
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875192,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875192
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8390804597701149,
"acc_stderr": 0.013140225515611729,
"acc_norm": 0.8390804597701149,
"acc_norm_stderr": 0.013140225515611729
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053616,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053616
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7845659163987139,
"acc_stderr": 0.023350225475471442,
"acc_norm": 0.7845659163987139,
"acc_norm_stderr": 0.023350225475471442
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48370273794002605,
"acc_stderr": 0.012763450734699814,
"acc_norm": 0.48370273794002605,
"acc_norm_stderr": 0.012763450734699814
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.0184334276494019,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.0184334276494019
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900794,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900794
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401705,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401705
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842892,
"mc2": 0.4225455721872872,
"mc2_stderr": 0.014721446205323074
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663588
},
"harness|gsm8k|5": {
"acc": 0.4981046247156937,
"acc_stderr": 0.013772385765569753
}
}
```
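For illustration, an MMLU-style macro average can be recomputed from a results dict shaped like the JSON above by averaging the `acc` of every `hendrycksTest` task. The sketch below uses only two entries copied from the latest results; the real dict has 57 such tasks, so the printed value is not the leaderboard score.

```python
# Minimal stand-in for the results dict above (two real entries, not all 57).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
}

# Macro average: mean of per-task accuracies over the MMLU subtasks.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
macro_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(macro_acc, 4))  # 0.4539
```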
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BatsResearch/ctga-v1 | ---
configs:
- config_name: default
data_files:
- path: train/*.arrow
split: train
task_categories:
- text-generation
language:
- en
size_categories:
- 1M<n<10M
pretty_name: conditional task generation with attributes
---
# Dataset Card for ctga-v1
## Dataset Details
`ctga-v1` or conditional task generation with attributes is a new dataset created by remixing existing instruction tuning datasets ([P3](https://github.com/bigscience-workshop/promptsource)) to train [Bonito](https://huggingface.co/BatsResearch/bonito-v1).
```python3
from datasets import load_dataset
dataset = load_dataset("BatsResearch/ctga-v1")
```
### Dataset Description
- **Repository:** [Github Repo](https://github.com/BatsResearch/bonito)
- **Paper:** [Arxiv](TODO)
- **Point of Contact:** [Nihal V. Nayak](mailto:nnayak2@cs.brown.edu)
## Dataset Creation
The dataset is derived from [P3](https://github.com/bigscience-workshop/promptsource) by annotating 323 prompt templates from 39 datasets with 16 task types.
The prompt templates in P3 are remixed to create the meta-templates, which, in turn, generate the training examples.
The meta-template input has a task type (<|tasktype|>) as an attribute followed by the unannotated text or context (<|context|>).
The output of the meta-template comprises the attributed task with the prompt or task description and the context ({context}) followed by a pipe symbol (<|pipe|>) and the solution to the task.
We use the <|pipe|> symbol to separate the instruction and response pair that is used for adapting the downstream model.
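For illustration, the instruction–response pair can be recovered from a generated sequence by splitting on the <|pipe|> marker. The text in this sketch is invented, not drawn from the corpus; only the layout (instruction and context, then <|pipe|>, then the solution) follows the meta-template format described above.

```python
# Invented meta-template output: instruction + context, <|pipe|>, solution.
example_output = (
    "Answer the question from the context.\n"
    "Context: Bonito converts unannotated text into instruction tuning data.\n"
    "Question: What does Bonito convert unannotated text into?"
    "<|pipe|>"
    "Instruction tuning data."
)

# Split once on <|pipe|> to separate the instruction from the response
# used for adapting the downstream model.
instruction, response = example_output.split("<|pipe|>", 1)
print(response.strip())  # Instruction tuning data.
```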
### Data Instances
Each data instance contains the following features: _context_, _task_input_, _task_output_, _dataset_, _dataset_config_, _task_type_, _input_, and _output_.
The (_input_, _output_) pair is what we used to train the Bonito model.
### Data Fields
- 'context': input context
- 'task_input': prompted input without context
- 'task_output': corresponding output
- 'dataset': source dataset
- 'dataset_config': source dataset configuration
- 'task_type': corresponding task type
- 'input': reformatted input
- 'output': reformatted output
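As a sketch of how these fields are used, the _task_type_ field lets you tally how many (_input_, _output_) training pairs each task type contributes. The rows below are invented stand-ins with the same fields; with the real dataset you would iterate `load_dataset("BatsResearch/ctga-v1")["train"]` instead.

```python
from collections import Counter

# Invented stand-in rows carrying the fields described above.
rows = [
    {"task_type": "summarization", "input": "Summarize: ...", "output": "..."},
    {"task_type": "summarization", "input": "Summarize: ...", "output": "..."},
    {"task_type": "natural language inference", "input": "...", "output": "..."},
]

# Count training pairs per task type.
task_counts = Counter(row["task_type"] for row in rows)
print(task_counts["summarization"])  # 2
```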
### Source Data
All the datasets are sourced from the datasets library.
- Extractive Question Answering & Question Generation
- adversarial_qa/dbert
- adversarial_qa/dbidaf
- adversarial_qa/droberta
- duorc/ParaphraseRC
- duorc/SelfRC
- squad
- Topic Classification
- ag_news
- dbpedia_14
- hellaswag
- duorc/ParaphraseRC
- duorc/SelfRC
- squad
- Sentiment Analysis
- amazon_polarity
- imdb
- rotten_tomatoes
- yelp_review_full
- Natural Language Inference
- anli
- super_glue/cb
- Multiple-Choice Question Answering
- app_reviews
- cosmos_qa
- dream
- qasc
- quail
- quartz
- race/all
- social_i_qa
- super_glue/boolq
- super_glue/record
- wiki_hop/original
- Text Generation
- app_reviews
- cnn_dailymail/3.0.0
- dream
- duorc/ParaphraseRC
- duorc/SelfRC
- gigaword
- samsum
- Summarization
- cnn_dailymail/3.0.0
- duorc/ParaphraseRC
- duorc/SelfRC
- gigaword
- multi_newspaws/labeled_final
- samsum
- xsum
- Paraphrase Generation & Identification
- glue/mrpc
- multi_newspaws/labeled_final
- Yes-No Question Answering
- race/all
- social_i_qa
- super_glue/boolq
- Sentence Completion
- hellaswag
- super_glue/copa
- Textual Entailment
- super_glue/rte
- Word Sense Disambiguation
- super_glue/wic
- Coreference Resolution
- super_glue/wsc.fixed
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@article{bonito:arxiv24,
Author = {Nihal V. Nayak and Yiyang Nan and Avi Trost and Stephen H. Bach},
Title = {Learning to Generate Instruction Tuning Datasets for Zero-Shot Task Adaptation},
Volume = {arXiv:2402.18334 [cs.CL]},
Year = {2024}}
```
|
kaleemWaheed/twitter_dataset_1712991169 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 235247
num_examples: 615
download_size: 85946
dataset_size: 235247
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mteb/cqadupstack-android | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- text-retrieval
source_datasets:
- cqadupstack-android
task_ids:
- document-retrieval
config_names:
- corpus
tags:
- text-retrieval
dataset_info:
- config_name: default
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 43411
num_examples: 1696
- config_name: corpus
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 14044469
num_examples: 22998
- config_name: queries
features:
- name: _id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 45157
num_examples: 699
configs:
- config_name: default
data_files:
- split: test
path: qrels/test.jsonl
- config_name: corpus
data_files:
- split: corpus
path: corpus.jsonl
- config_name: queries
data_files:
- split: queries
path: queries.jsonl
--- |
seanghay/khPOS | ---
license: cc-by-nc-sa-4.0
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': AB
'1': AUX
'2': CC
'3': CD
'4': DBL
'5': DT
'6': ETC
'7': IN
'8': JJ
'9': KAN
'10': M
'11': NN
'12': PA
'13': PN
'14': PRO
'15': QT
'16': RB
'17': RPN
'18': SYM
'19': UH
'20': VB
'21': VB_JJ
'22': VCOM
splits:
- name: train
num_bytes: 3569524
num_examples: 12000
download_size: 2372205
dataset_size: 3569524
task_categories:
- text-classification
- text-generation
language:
- km
pretty_name: Khmer Part-of-Speech Corpus for Khmer NLP Research and Developments
size_categories:
- 10K<n<100K
---
> I am not the author of this dataset. [View on GitHub](https://github.com/ye-kyaw-thu/khPOS).
# khPOS (draft released 1.0)
khPOS (Khmer Part-of-Speech) Corpus for Khmer NLP Research and Developments
## License
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License
[License details](https://creativecommons.org/licenses/by-nc-sa/4.0/)
## Introduction
The khPOS Corpus (Khmer POS Corpus) is a manually word-segmented and POS-tagged corpus of 12,000 sentences (25,626 words) developed for Khmer language NLP research and development. We collected Khmer sentences from websites covering various areas such as economics, news, and politics. The corpus also contains student lists and voter lists from the National Election Committee of Cambodia. The average number of words per sentence in the whole corpus is 10.75. Here, symbols such as "។" (Khmer sign Khan), "៖" (Khmer sign Camnuc pii kuuh), "-", "?", "\[", "\]" etc. are also counted as words. The shortest sentence contains only 1 word and the longest contains 169 words, shown below (format, line number : Khmer sentence):
1814 : " ម៉ែ ឥត មាន ស្អប់_ខ្ពើម ឪពុក កូន ឯង ទេ ម៉ែ តែង នឹក មក កូន នឹង ឪពុក ឯង ពុំ មាន ភ្លេច ព្រម_ទាំង អ្នក\~ភូមិ ផង របង ជាមួយ ឯង ទៀត ដែល ម្ដាយ ធ្លាប់ នៅ ជាមួយ គេ ប៉ុន្តែ ម៉ែ ជាតិ ជា ទេព_ធីតា ពុំ អាច នៅ ជាមួយ មនុស្ស_លោក បាន យូរ ទេ រាល់ ថ្ងៃ ម៉ែ តែង ទៅ បំពេញ កិច្ច នៅ ចំពោះ មុខ ព្រះ\~ភក្ត្រ ព្រះ\~ឥន្ទ្រាធិរាជ គឺ សុំ អង្វរ ឲ្យ ព្រះ\~អង្គ ប្រទាន ពរ ដល់ កូន ឯង និង ឪពុក កូន ឯង កុំ បី ខាន មិន តែ ប៉ុណ្ណោះ ម្ដាយ បាន ទាំង ទូល សុំ ព្រះ\~ឥន្ទ្រ ឲ្យ ព្រះ\~អង្គ មេត្តា ផ្សាយ នូវ សុភ_មង្គល ដល់ មនុស្ស នៅ ឋាន នេះ ទូទៅ ផង កូន_ប្រុស ពន្លក ម្ដាយ ! ម្ដាយ ពុំ អាច នៅ ជាមួយ_នឹង កូន បាន ទៀត តែ ម្ដាយ យក កូន ឯង ទៅ លេង ប្រាសាទ ម្ដាយ ឯ ឋាន លើ មួយ ដង ម្ដាយ នឹង នាំ កូន ឯង ទៅ មុជ_ទឹក ក្នុង អាង ក្រអូប នៅ_ក្នុង សួន ព្រះ\~ឥន្ទ្រ ហើយ ទឹក នោះ នឹង ជម្រះ កាយ កូន ឯង ឲ្យ បាត់ ធំ ក្លិន មនុស្ស_លោក បន្ទាប់_ពី នោះ មក ម្ដាយ នឹង នាំ កូន ឯង ចូល ទៅ_ក្នុង ប្រាសាទ រួច នាំ កូន ឯង ទៅ ថ្វាយ_បង្រះ\~ឥន្ទ្រ " ។
## Word Segmentation
In Khmer texts, words composed of single or multiple syllables are usually not separated by white space. Spaces are used for easier reading and are generally placed between phrases, but there are no clear rules for using spaces in Khmer. Therefore, word segmentation is a necessary prerequisite for POS tagging. Four classes of segment (word) types were observed during the manual segmentation of the corpus, each representing a different type of word:
- Word Type 1: Single Words
- Word Type 2: Compound Words
- Word Type 3: Compound Words with Prefix
- Word Type 4: Compound Words with Suffix
For detailed information on the word segmentation rules and how we built a Khmer word segmentation model, please refer to our published paper (see the Publication section).
## POS Tags
Part of speech is a category to which a word is assigned in accordance with its syntactic functions. In the Khmer grammatical system, many linguists have defined their own POS tag sets according to their research interests. Although many books have been published, there is still no standard agreement, especially on the number and names of POS tags. Compared to English, some English POS are not used in Khmer, such as the gerund, comparative and superlative adjectives, particles, etc. The Khmer POS tag set is defined based on the CHOUN NATH dictionary. Some new POS tags that are not defined in the dictionary were added for the word disambiguation task. Unlike English grammar, some Khmer sentences consist of more than one verb.
The definitions and descriptions of the POS tags are presented in detail as follows:
1. Abbreviation (AB): For example, គម or គ.ម for kilometer (km), អសប for United Nations (UN), ពស or ព.ស for ពុទ សក ជ (Buddhist era), នប or ន.ប for នគរ ល (police), អហ or អ.ហ for វុធហត (military police) etc.
2. Adjective (JJ): An adjective is a word used to modify or describe a noun. Adjectives usually appear to the right of the noun; very few adjectives precede it. For example, ក្រហម (red), កន្លះ (half), ប្លែក (strange), តូច (small), ល្អ (good), ស្អាត (beautiful) etc.
3. Adverb (RB): An adverb is a word that is used to modify verb, adjective or another adverb. For example, ណាស់ (very), ពុំ (not), ទើប (just), ពេកក្រៃ (very), ហើយ (already) etc.
4. Auxiliary Verb (AUX): Only three groups of verbs, used to form tense, are tagged as auxiliary verbs:
- Past form: បាន or មាន + Verb
- Progressive form: កំពុង + Verb
- Future form: នឹង + Verb
5. Cardinal Number (CD): A cardinal number is a word or a number denoting quantity. For example, បី (three), ១០០ (100), ចតុ (four), ពាន់ (thousand), លាន (million) etc.
6. Conjunction (CC): Conjunction is a word to connect between words, phrases, and sentences. ក៏ប៉ុន្តែ (but), ពីព្រោះ (because), ដ្បិត (for, since), ទម្រាំតែ (until), ពុំនោះសោត (otherwise), បើ (if) etc.
7. Currency (CUR): CUR for currency symbol such as: ៛, \$, ₤, € etc.
8. Determiner Pronoun (DT): In Khmer grammar, unlike English, determiners are classified under pronouns. A determiner is used to indicate the location and/or uncertainty of a noun. They are equivalent to the English words: this, that, those, these, all, every, each, some etc. For example, នេះ (this), នោះ (that), ទាំងនេះ (these), ទាំងអស់ (all), នានា (various), ខ្លះ (some), សព្វ (every) etc.
9. Double Sign (DBL): Double sign (ៗ) is used to remind reader to read the previous word twice. For example, មនុស្ស/NN (people) គ្រប់/DT (every) ៗ/DBL គ្នា/PRO (person), "everybody" in English.
10. Et Cetera (ETC): ។ល។ is equal to et cetera (etc.) in English.
11. Full Stop (KAN): There are two full stops in Khmer language, ។ for sentence and ៕ for paragraph.
12. Interjection (UH): An interjection represents the sound of an animal or machine, or a sound of surprise. Interjections always appear at the beginning of a sentence and are mostly followed by an exclamation mark. For example, អូ (Oh!), ម៉េវ (Meow), អ៊ុះ (uh) etc.
13. Measure Word (M): Measure words describe a quantity of a corresponding class of noun. Some of these words have no equivalent in English. For example: ព្រះសង្គ/NN (monk) ២/CD (2) អង្គ/M (person), សំលៀកបំពាក់/NN (cloth) ១/CD (1) សម្រាប់/M (set), ឆ្កែ/NN (dog) ១/CD (1) ក្បាល/M (head) etc.
14. Noun (NN): A noun is a word or compound word that identifies a person, an animal, an object, an idea, a thing, etc. For example: ឡាន (Car), ការអភិវឌ្ឍន៍ (Development), សកម្មភាព (Action), ខ្មៅដៃ (Pencil), ទឹកកក (Ice) etc.
15. Particle (PA): We consider three types of particles: hesitation, response, and final. For the two medial particle words ក៏ ("so, then, but" in English) and នូវ ("of, with" in English) \[1\], we tag them as RB and IN respectively.
- Hesitation Particle: ខ្ញុំ (I) គិត (think) …អ៊ើ/PA (Er. . .) មិន (not) ឃើញ (see), ("I er… don’t think so" in English)
- Response Particle: អើ/PA (Hm, Ah) ខ្ញុំ (I) ដឹង (know) ហើយ (already), ("Hmm I already know" in English)
- Final Particle: There are some final particles such as ណា៎, សិន and ចុះ. Example usage of ណា៎: កុំ/RB (don't) ភ្លេច/VB (forget) ណា៎/PA, ("Hmm don't forget!" in English), Example usage of សិន: ចាំ/VB (wait) បន្តិច/RB (a while) សិន/PA, Example usage of ចុះ: ទៅ/VB (go) ចុះ/PA
16. Preposition (IN): A preposition is a word or compound word used to connect two different words or phrases. It indicates place, time, possession, relation, etc. For example, ចំពោះ (to), ដល់ (to), ដើម្បី (in order to), ក្នុង (in), លើ (on), រវាង (between, around) etc.
17. Pronoun (PRO): A pronoun is a word that substitutes for a noun or a noun phrase. These words are equivalent to the English words: I, he, she, it, we, they, them, him, her etc. For example, ខ្ញុំ (I), គាត់ (he or she), យើង (we), ពួកយើង (our group or we), ខ្ញុំបាទ (polite form of I, me), ទូលបង្គំ (I, me for conversation with the royal family) etc.
18. Proper Noun (PN): A proper noun is a noun that represents a unique thing, for example, the name of a person, place, or date. For example: សុខា (Sokha), ភ្នំពេញ (Phnom Penh), ថ្ងៃអង្គារ (Tuesday), កាល់តិច (Caltex), មេគង្គ (Mekong) etc.
19. Question Word (QT): In Khmer language, តើ is mostly used in the beginning of an interrogative sentence. For example,
តើ/QT អ្នក/PRO (you) ឈ្មោះ/NN (name) អ្វី/PRO (what)?, "What is your name?" in English.
20. Relative Pronoun (RPN): In Khmer language, there is only one relative pronoun. It is ដែល "that, which, where, who" in English.
21. Symbol (SYM): SYM for others sign or symbol such as: +, -, \*, \/, ៖, =, @, \#, \% etc.
22. VB\_JJ: VB\_JJ is a tag for an adjective whose original form is a verb. Currently, there is no proposed POS tag name for this kind of Khmer word. Although we could use the JJ tag, we use VB\_JJ to clarify its function and for semantic purposes. For example:
- The word សម្រាប់ (for) or ដើម្បី (to) is normally removed in both written and spoken Khmer.
កន្លែង/NN (place) សម្រាប់ (for) ធ្វើការ/VB\_JJ (working), office in English
ម៉ាស៊ីន/NN (Machine) សម្រាប់ (for) បោក/VB\_JJ (washing) ខោអាវ/NN (cloth), washing machine in English
ពួកគាត់/PRO (they) អាច/VB (can) មាន/VB (have) ការងារ/NN (work) ធ្វើ/VB\_JJ (to do)
- When the Khmer relative pronoun is removed, the verb keeps its original form. It must be tagged VB\_JJ because it is no longer a verb in the subordinate clause.
សិស្ស (student) ដែល (who) មាន/VB (has) ពិន្ទុ (mark) ខ្ពស់ (high) នឹង (will) ទទួលបាន (get) អាហារូបករណ៍ (scholarship), "a student who has a high mark will get a scholarship" in English; but when ដែល (who) is removed, មាន/VB (has) becomes មាន/VB\_JJ (having)
23. Verb (VB): A verb is a word that expresses an action, event, or condition. The verb is the middle part of a phrase. Normally, a verb needs an object, and sometimes it also needs a complement. For example, ស្តាប់ (listen), មានប្រសាសន៍ (say), ស្រលាញ់ (love), ច្រៀង (sing), បើកបរ (drive) etc.
24. Verb Complement (VCOM): Its original form is a verb, but it becomes VCOM when two verbs occur in a sentence and the second emphasizes the first. In particular, when a compound verb is split by the word មិន (no or not), the first part is a verb and the second part is VCOM. For example, លក់ (sell) ដាច់/VCOM (a lot), ប្រលង (exam) មិន (no) ជាប់/VCOM (pass), ដេក/VB (sleep) មិន/RB (not) លក់/VCOM (sleep well) etc.
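In this dataset the `pos_tags` column stores integer class labels, ordered as in the `ClassLabel` feature declared in the card's metadata. A small helper (hypothetical, not part of the corpus tooling) can map them back to the tag names defined above:

```python
# The 23 POS tags in the order declared by this dataset's ClassLabel feature
POS_TAGS = [
    "AB", "AUX", "CC", "CD", "DBL", "DT", "ETC", "IN", "JJ", "KAN",
    "M", "NN", "PA", "PN", "PRO", "QT", "RB", "RPN", "SYM", "UH",
    "VB", "VB_JJ", "VCOM",
]

def decode_tags(tag_ids):
    """Map integer class labels back to their POS tag names."""
    return [POS_TAGS[i] for i in tag_ids]

# The Double Sign example មនុស្ស/NN គ្រប់/DT ៗ/DBL គ្នា/PRO would be stored as:
print(decode_tags([11, 5, 4, 14]))  # ['NN', 'DT', 'DBL', 'PRO']
```

Note that CUR does not appear in the `ClassLabel` list, so tag ids here cover 23 of the 24 categories described above.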
## Files/Scripts
Corpus-draft-ver-1.0/ (**_latest version_**)
**Scripts:**
mk-wordtag.pl : Perl script for printing word only file, tag only file, listing compound-words etc.
mk-pair.pl : Perl script for combining word file and tag file to word/tag format
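mk-pair.pl is a Perl script distributed with the corpus; as an illustration only, a hypothetical Python equivalent of the word/tag pairing it performs might look like:

```python
def make_pairs(word_line, tag_line, sep="/"):
    """Combine a space-separated word line and tag line into word/tag format
    (a sketch of what mk-pair.pl produces, not the original script)."""
    words = word_line.split()
    tags = tag_line.split()
    assert len(words) == len(tags), "word and tag counts must match"
    return " ".join(f"{w}{sep}{t}" for w, t in zip(words, tags))

# Using the question-word example from the POS Tags section:
print(make_pairs("តើ អ្នក ឈ្មោះ អ្វី", "QT PRO NN PRO"))
# តើ/QT អ្នក/PRO ឈ្មោះ/NN អ្វី/PRO
```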
**Data:**
data/ : Data preparation folder for incremental POS-tagging models
**Models:**
Two-Hours/: Incremental training (2,000 to 12,000 sentences) of 2hours annotation approach models with khPOS corpus.
Running logfile: [note.txt](https://github.com/ye-kyaw-thu/khPOS/blob/master/corpus-draft-ver-1.0/model/2hours/note.txt)
3gHMM/ : Incremental training (2,000 to 12,000 sentences) of 3-gram HMM (Hidden Markov Model) models with khPOS corpus.
Running logfile: [note.txt](https://github.com/ye-kyaw-thu/khPOS/blob/master/corpus-draft-ver-1.0/model/3gHMM/note.txt)
crf/ : Incremental training (2,000 to 12,000 sentences) of CRF POS-tagging models with khPOS corpus.
Running logfile: [note.txt](https://github.com/ye-kyaw-thu/khPOS/blob/master/corpus-draft-ver-1.0/model/crf/note.txt)
kytea/ : Incremental training (2,000 to 12,000 sentences) of L2 regularized SVM models with khPOS corpus.
Running logfile: [note](https://github.com/ye-kyaw-thu/khPOS/blob/master/corpus-draft-ver-1.0/model/kytea/note.txt)
maxent/ : Incremental training (2,000 to 12,000 sentences) of Maximum Entropy models with khPOS corpus.
Running logfile: [note.txt](https://github.com/ye-kyaw-thu/khPOS/blob/master/corpus-draft-ver-1.0/model/maxent/note.txt)
rdr/ : Incremental training (2,000 to 12,000 sentences) of RDR (Ripple Down Rule-based) models with khPOS corpus.
Running logfile: [note.txt](https://github.com/ye-kyaw-thu/khPOS/blob/master/corpus-draft-ver-1.0/model/rdr/note.txt)
## Development and Support
Contributors
Vichet Chea
[Ye Kyaw Thu](https://sites.google.com/site/yekyawthunlp/)
## Acknowledgements
We would like to express our gratitude to Mr. Sorn Kea and Miss Leng Greyhuy for their help in POS tagging 12,100 sentences of Khmer Corpus manually.
## Publication
*Please cite following paper:*
Ye Kyaw Thu, Vichet Chea, Yoshinori Sagisaka, "Comparison of Six POS Tagging Methods on 12K Sentences Khmer Language POS Tagged Corpus", In the first Regional Conference on Optical character recognition and Natural language processing technologies for ASEAN languages (ONA 2017), December 7-8, 2017, Phnom Penh, Cambodia. [paper](https://github.com/ye-kyaw-thu/khPOS/blob/master/khpos.pdf)
## Reference
Vichet Chea, Ye Kyaw Thu, Chenchen Ding, Masao Utiyama, Andrew Finch and Eiichiro Sumita, "Khmer Word Segmentation Using Conditional Random Fields", In Khmer Natural Language Processing 2015 (KNLP2015), December 4, 2015, Phnom Penh, Cambodia.
[paper](http://khmernlp.org/2015/wp-content/uploads/2016/09/Paper-Khmer-Word-Segmentation-Using-.pdf)
Madeline Elizabeth. Ehrman, Kem Sos, Foreign Service Institute (U.S.), and Defense Language Institute (U.S.). Contemporary Cambodian: grammatical sketch, by Madeline E. Ehrman, with the assistance of Kem Sos. Foreign Service Institute, Dept. of State; \[for sale by the Supt. of Docs., U.S. Govt. Print. Off.\] Washington, 1972. |