| datasetId | card |
|---|---|
amitraheja82/Market_Mail_Synthetic_DataSet | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 21260
num_examples: 10
download_size: 25244
dataset_size: 21260
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Market_Mail_Synthetic_DataSet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DIAS123/vozingle | ---
license: openrail
---
|
lama | ---
pretty_name: 'LAMA: LAnguage Model Analysis'
annotations_creators:
- crowdsourced
- expert-generated
- machine-generated
language_creators:
- crowdsourced
- expert-generated
- machine-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
- 1K<n<10K
- 1M<n<10M
- n<1K
source_datasets:
- extended|conceptnet5
- extended|squad
task_categories:
- text-retrieval
- text-classification
task_ids:
- fact-checking-retrieval
- text-scoring
paperswithcode_id: lama
tags:
- probing
dataset_info:
- config_name: trex
features:
- name: uuid
dtype: string
- name: obj_uri
dtype: string
- name: obj_label
dtype: string
- name: sub_uri
dtype: string
- name: sub_label
dtype: string
- name: predicate_id
dtype: string
- name: sub_surface
dtype: string
- name: obj_surface
dtype: string
- name: masked_sentence
dtype: string
- name: template
dtype: string
- name: template_negated
dtype: string
- name: label
dtype: string
- name: description
dtype: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 656913189
num_examples: 1304391
download_size: 74652201
dataset_size: 656913189
- config_name: squad
features:
- name: id
dtype: string
- name: sub_label
dtype: string
- name: obj_label
dtype: string
- name: negated
dtype: string
- name: masked_sentence
dtype: string
splits:
- name: train
num_bytes: 57188
num_examples: 305
download_size: 74639115
dataset_size: 57188
- config_name: google_re
features:
- name: pred
dtype: string
- name: sub
dtype: string
- name: obj
dtype: string
- name: evidences
dtype: string
- name: judgments
dtype: string
- name: sub_w
dtype: string
- name: sub_label
dtype: string
- name: sub_aliases
dtype: string
- name: obj_w
dtype: string
- name: obj_label
dtype: string
- name: obj_aliases
dtype: string
- name: uuid
dtype: string
- name: masked_sentence
dtype: string
- name: template
dtype: string
- name: template_negated
dtype: string
splits:
- name: train
num_bytes: 7638657
num_examples: 6106
download_size: 74639115
dataset_size: 7638657
- config_name: conceptnet
features:
- name: uuid
dtype: string
- name: sub
dtype: string
- name: obj
dtype: string
- name: pred
dtype: string
- name: obj_label
dtype: string
- name: masked_sentence
dtype: string
- name: negated
dtype: string
splits:
- name: train
num_bytes: 4130000
num_examples: 29774
download_size: 74639115
dataset_size: 4130000
config_names:
- conceptnet
- google_re
- squad
- trex
---
# Dataset Card for LAMA: LAnguage Model Analysis - a dataset for probing and analyzing the factual and commonsense knowledge contained in pretrained language models.
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
https://github.com/facebookresearch/LAMA
- **Repository:**
https://github.com/facebookresearch/LAMA
- **Paper:**
@inproceedings{petroni2019language,
title={Language Models as Knowledge Bases?},
author={F. Petroni, T. Rockt{\"{a}}schel, A. H. Miller, P. Lewis, A. Bakhtin, Y. Wu and S. Riedel},
booktitle={In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019},
year={2019}
}
@inproceedings{petroni2020how,
title={How Context Affects Language Models' Factual Predictions},
author={Fabio Petroni and Patrick Lewis and Aleksandra Piktus and Tim Rockt{\"a}schel and Yuxiang Wu and Alexander H. Miller and Sebastian Riedel},
booktitle={Automated Knowledge Base Construction},
year={2020},
url={https://openreview.net/forum?id=025X0zPfn}
}
### Dataset Summary
This dataset provides the data for LAMA. It includes a subset of
Google_RE (https://code.google.com/archive/p/relation-extraction-corpus/),
T-REx (a subset of Wikidata triples), ConceptNet
(https://github.com/commonsense/conceptnet5/wiki), and SQuAD. There is a
config for each of "google_re", "trex", "conceptnet", and "squad".
The data has been cleaned up, and a masked sentence with associated
answers for the [MASK] token has been added. Accuracy in predicting the
[MASK] token shows how well the language model knows facts and
commonsense information. [MASK] tokens appear only in the "object"
slots.
This version of the dataset includes "negated" sentences in addition to
the masked sentences. Some configs also include "template" and
"template_negated" fields of the form "[X] some text [Y]", where [X]
and [Y] are the subject and object slots, respectively, of certain
relations.
See the paper for more details. For more information, also see:
https://github.com/facebookresearch/LAMA
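As an illustration of how these fields combine into a probe, the sketch below instantiates a template as a cloze query. The field values are taken from the trex sample shown in this card, and the `make_cloze` helper is a hypothetical convenience function, not part of the dataset or the LAMA codebase:

```python
# Sketch: turn a LAMA-style record into a cloze query for a masked LM.
# Field names follow the "trex" config; the record below is the sample
# instance from this card, not a live lookup.
record = {
    "sub_label": "president of the French Republic",
    "obj_label": "France",
    "template": "[X] is a legal term in [Y] .",
}

def make_cloze(record, mask_token="[MASK]"):
    """Instantiate the template: [X] -> subject label, [Y] -> mask token."""
    query = record["template"].replace("[X]", record["sub_label"])
    return query.replace("[Y]", mask_token)

print(make_cloze(record))
# A masked LM is then scored on recovering record["obj_label"] at the mask.
```

The mask token should match the model being probed (e.g. `[MASK]` for BERT-style models, `<mask>` for RoBERTa-style models).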
### Languages
en
## Dataset Structure
### Data Instances
A sample instance from the trex config:
```python
{'description': 'the item (an institution, law, public office ...) or statement belongs to or has power over or applies to the value (a territorial jurisdiction: a country, state, municipality, ...)', 'label': 'applies to jurisdiction', 'masked_sentence': 'It is known as a principality as it is a monarchy headed by two Co-Princes – the Spanish/Roman Catholic Bishop of Urgell and the President of [MASK].', 'obj_label': 'France', 'obj_surface': 'France', 'obj_uri': 'Q142', 'predicate_id': 'P1001', 'sub_label': 'president of the French Republic', 'sub_surface': 'President', 'sub_uri': 'Q191954', 'template': '[X] is a legal term in [Y] .', 'template_negated': '[X] is not a legal term in [Y] .', 'type': 'N-M', 'uuid': '3fe3d4da-9df9-45ba-8109-784ce5fba38a'}
```
A sample instance from the conceptnet config:
```python
{'masked_sentence': 'One of the things you do when you are alive is [MASK].', 'negated': '', 'obj': 'think', 'obj_label': 'think', 'pred': 'HasSubevent', 'sub': 'alive', 'uuid': 'd4f11631dde8a43beda613ec845ff7d1'}
```
A sample instance from the squad config:
```python
{'id': '56be4db0acb8001400a502f0_0', 'masked_sentence': 'To emphasize the 50th anniversary of the Super Bowl the [MASK] color was used.', 'negated': "['To emphasize the 50th anniversary of the Super Bowl the [MASK] color was not used.']", 'obj_label': 'gold', 'sub_label': 'Squad'}
```
A sample instance from the google_re config:
```python
{'evidences': '[{\'url\': \'http://en.wikipedia.org/wiki/Peter_F._Martin\', \'snippet\': "Peter F. Martin (born 1941) is an American politician who is a Democratic member of the Rhode Island House of Representatives. He has represented the 75th District Newport since 6 January 2009. He is currently serves on the House Committees on Judiciary, Municipal Government, and Veteran\'s Affairs. During his first term of office he served on the House Committees on Small Business and Separation of Powers & Government Oversight. In August 2010, Representative Martin was appointed as a Commissioner on the Atlantic States Marine Fisheries Commission", \'considered_sentences\': [\'Peter F Martin (born 1941) is an American politician who is a Democratic member of the Rhode Island House of Representatives .\']}]', 'judgments': "[{'rater': '18349444711114572460', 'judgment': 'yes'}, {'rater': '17595829233063766365', 'judgment': 'yes'}, {'rater': '4593294093459651288', 'judgment': 'yes'}, {'rater': '7387074196865291426', 'judgment': 'yes'}, {'rater': '17154471385681223613', 'judgment': 'yes'}]", 'masked_sentence': 'Peter F Martin (born [MASK]) is an American politician who is a Democratic member of the Rhode Island House of Representatives .', 'obj': '1941', 'obj_aliases': '[]', 'obj_label': '1941', 'obj_w': 'None', 'pred': '/people/person/date_of_birth', 'sub': '/m/09gb0bw', 'sub_aliases': '[]', 'sub_label': 'Peter F. Martin', 'sub_w': 'None', 'template': '[X] (born [Y]).', 'template_negated': '[X] (not born [Y]).', 'uuid': '18af2dac-21d3-4c42-aff5-c247f245e203'}
```
### Data Fields
The trex config has the following fields:
* uuid: the id
* obj_uri: a uri for the object slot
* obj_label: a label for the object slot
* sub_uri: a uri for the subject slot
* sub_label: a label for the subject slot
* predicate_id: the predicate/relationship
* sub_surface: the surface text for the subject
* obj_surface: The surface text for the object. This is the word that should be predicted by the [MASK] token.
* masked_sentence: The masked sentence used to probe, with the object word replaced with [MASK]
* template: A pattern of text for extracting the relationship, object and subject of the form "[X] some text [Y]", where [X] and [Y] are the subject and object slots respectively. template may be missing and replaced with an empty string.
* template_negated: Same as above, except [Y] is not the object. template_negated may be missing and replaced with an empty string.
* label: the label for the relationship/predicate. label may be missing and replaced with an empty string.
* description: a description of the relationship/predicate. description may be missing and replaced with an empty string.
* type: a type id for the relationship/predicate. type may be missing and replaced with an empty string.
The conceptnet config has the following fields:
* uuid: the id
* sub: the subject. sub may be missing and replaced with an empty string.
* obj: the object to be predicted. obj may be missing and replaced with an empty string.
* pred: the predicate/relationship
* obj_label: the object label
* masked_sentence: The masked sentence used to probe, with the object word replaced with [MASK]
* negated: same as above, except [MASK] is replaced by something that is not the object word. negated may be missing and replaced with empty strings.
The squad config has the following fields:
* id: the id
* sub_label: the subject label
* obj_label: the object label that is being predicted
* masked_sentence: The masked sentence used to probe, with the object word replaced with [MASK]
* negated: same as above, except [MASK] is replaced by something that is not the object word. negated may be missing and replaced with empty strings.
The google_re config has the following fields:
* uuid: the id
* pred: the predicate
* sub: the subject. sub may be missing and replaced with an empty string.
* obj: the object. obj may be missing and replaced with an empty string.
* evidences: a flattened JSON-like string that provides evidence for the predicate; parse this string to extract the 'snippet' information.
* judgments: data about judgments
* sub_w: unknown
* sub_label: label for the subject
* sub_aliases: unknown
* obj_w: unknown
* obj_label: label for the object
* obj_aliases: unknown
* masked_sentence: The masked sentence used to probe, with the object word replaced with [MASK]
* template: A pattern of text for extracting the relationship, object and subject of the form "[X] some text [Y]", where [X] and [Y] are the subject and object slots respectively.
* template_negated: Same as above, except the [Y] is not the object.
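Because `evidences` and `judgments` are stored as flattened strings, they need to be parsed back into structures. In the sample record shown above the strings use Python-style single quotes, so this sketch uses `ast.literal_eval`; this is an assumption based on the sample shown, and records that happen to be valid JSON could be parsed with `json.loads` instead:

```python
import ast

# Sketch: recover structured data from the flattened "judgments" string.
# The string below is an excerpt of the google_re sample in this card; it
# uses Python-style quoting, hence ast.literal_eval rather than json.loads.
judgments_str = (
    "[{'rater': '18349444711114572460', 'judgment': 'yes'}, "
    "{'rater': '17595829233063766365', 'judgment': 'yes'}]"
)
judgments = ast.literal_eval(judgments_str)

# Count how many raters agreed with the extracted fact.
yes_votes = sum(1 for j in judgments if j["judgment"] == "yes")
print(yes_votes)  # 2
```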
### Data Splits
There are no data splits.
## Dataset Creation
### Curation Rationale
This dataset was gathered and created to probe what language models understand.
### Source Data
#### Initial Data Collection and Normalization
See the research paper and website for more detail. The dataset was
gathered from various other datasets and cleaned up for probing.
#### Who are the source language producers?
The LAMA authors and the original authors of the various configs.
### Annotations
#### Annotation process
Human annotations from the original datasets (e.g., ConceptNet), plus various machine annotations.
#### Who are the annotators?
Human annotations and machine annotations.
### Personal and Sensitive Information
Unknown, but likely includes names of famous people.
## Considerations for Using the Data
### Social Impact of Dataset
The goal for the work is to probe the understanding of language models.
### Discussion of Biases
Since part of the data comes from human annotators, biases are likely.
[More Information Needed]
### Other Known Limitations
The original documentation for the data fields is limited.
## Additional Information
### Dataset Curators
The authors of LAMA at Facebook and the authors of the original datasets.
### Licensing Information
The Creative Commons Attribution-NonCommercial 4.0 International License. See https://github.com/facebookresearch/LAMA/blob/master/LICENSE
### Citation Information
@inproceedings{petroni2019language,
title={Language Models as Knowledge Bases?},
author={F. Petroni, T. Rockt{\"{a}}schel, A. H. Miller, P. Lewis, A. Bakhtin, Y. Wu and S. Riedel},
booktitle={In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2019},
year={2019}
}
@inproceedings{petroni2020how,
title={How Context Affects Language Models' Factual Predictions},
author={Fabio Petroni and Patrick Lewis and Aleksandra Piktus and Tim Rockt{\"a}schel and Yuxiang Wu and Alexander H. Miller and Sebastian Riedel},
booktitle={Automated Knowledge Base Construction},
year={2020},
url={https://openreview.net/forum?id=025X0zPfn}
}
### Contributions
Thanks to [@ontocord](https://github.com/ontocord) for adding this dataset. |
JovialValley/phoneme_totaldataset_2 | ---
dataset_info:
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype: string
- name: emotion
dtype: string
- name: emotion_str
dtype: string
splits:
- name: train
num_bytes: 163385611.0
num_examples: 390
- name: test
num_bytes: 41691832.0
num_examples: 97
download_size: 138543168
dataset_size: 205077443.0
---
# Dataset Card for "phoneme_totaldataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random25eof_find_passage_train1000_eval1000_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 298700
num_examples: 3000
- name: validation
num_bytes: 118222
num_examples: 1000
download_size: 181208
dataset_size: 416922
---
# Dataset Card for "random25eof_find_passage_train1000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datahrvoje/twitter_dataset_1713147651 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20239
num_examples: 45
download_size: 12303
dataset_size: 20239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
surrey-nlp/PLOD-filtered | ---
annotations_creators:
- Leonardo Zilio, Hadeel Saadany, Prashant Sharma, Diptesh Kanojia, Constantin Orasan
language_creators:
- found
language:
- en
license: cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- token-classification
task_ids: []
paperswithcode_id: plod-filtered
pretty_name: 'PLOD: An Abbreviation Detection Dataset'
tags:
- abbreviation-detection
---
# PLOD: An Abbreviation Detection Dataset
This is the repository for the PLOD dataset, published at LREC 2022. The dataset can help build sequence-labelling models for the task of abbreviation detection.
### Dataset
We provide two variants of our dataset - Filtered and Unfiltered. They are described in our paper here.
1. The Filtered version can be accessed via [Huggingface Datasets here](https://huggingface.co/datasets/surrey-nlp/PLOD-filtered) and a [CONLL format is present here](https://github.com/surrey-nlp/PLOD-AbbreviationDetection).<br/>
2. The Unfiltered version can be accessed via [Huggingface Datasets here](https://huggingface.co/datasets/surrey-nlp/PLOD-unfiltered) and a [CONLL format is present here](https://github.com/surrey-nlp/PLOD-AbbreviationDetection).<br/>
3. The [SDU Shared Task](https://sites.google.com/view/sdu-aaai22/home) data we use for zero-shot testing is [available here](https://huggingface.co/datasets/surrey-nlp/SDU-test).
# Dataset Card for PLOD-filtered
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** https://github.com/surrey-nlp/PLOD-AbbreviationDetection
- **Paper:** https://arxiv.org/abs/2204.12061
- **Leaderboard:** https://paperswithcode.com/sota/abbreviationdetection-on-plod-filtered
- **Point of Contact:** [Diptesh Kanojia](mailto:d.kanojia@surrey.ac.uk)
### Dataset Summary
This PLOD Dataset is an English-language dataset of abbreviations and their long-forms tagged in text. The dataset has been collected for research from the PLOS journals indexing of abbreviations and long-forms in the text. This dataset was created to support the Natural Language Processing task of abbreviation detection and covers the scientific domain.
### Supported Tasks and Leaderboards
This dataset primarily supports the Abbreviation Detection Task. It has also been tested on a train+dev split provided by the Acronym Detection Shared Task organized as a part of the Scientific Document Understanding (SDU) workshop at AAAI 2022.
### Languages
English
## Dataset Structure
### Data Instances
A typical data point comprises an `id`, the `tokens` present in the text, the corresponding `pos_tags` obtained via spaCy, and the `ner_tags`, which are limited to `AC` for acronyms/abbreviations and `LF` for long-forms.
An example from the dataset:
```python
{'id': '1',
'tokens': ['Study', '-', 'specific', 'risk', 'ratios', '(', 'RRs', ')', 'and', 'mean', 'BW', 'differences', 'were', 'calculated', 'using', 'linear', 'and', 'log', '-', 'binomial', 'regression', 'models', 'controlling', 'for', 'confounding', 'using', 'inverse', 'probability', 'of', 'treatment', 'weights', '(', 'IPTW', ')', 'truncated', 'at', 'the', '1st', 'and', '99th', 'percentiles', '.'],
'pos_tags': [8, 13, 0, 8, 8, 13, 12, 13, 5, 0, 12, 8, 3, 16, 16, 0, 5, 0, 13, 0, 8, 8, 16, 1, 8, 16, 0, 8, 1, 8, 8, 13, 12, 13, 16, 1, 6, 0, 5, 0, 8, 13],
'ner_tags': [0, 0, 0, 3, 4, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 3, 4, 4, 4, 4, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
```
### Data Fields
- id: the row identifier for the dataset point.
- tokens: The tokens contained in the text.
- pos_tags: the Part-of-Speech tags obtained for the corresponding token above from Spacy NER.
- ner_tags: The tags for abbreviations and long-forms.
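As a minimal sketch of how `tokens` and `ner_tags` line up, the snippet below pairs each token with its integer tag, using a slice of the example instance above. Note that the id-to-label mapping (which integer corresponds to `AC` versus `LF`) is not documented in this card; confirm it via the dataset's `ClassLabel` feature before relying on it:

```python
# Sketch: pair each token with its integer NER tag, using a slice of the
# example instance from this card. The id -> label mapping (e.g. which id
# is the abbreviation tag vs the long-form tag) is not documented here;
# inspect the dataset's ClassLabel feature to resolve it.
tokens = ["Study", "-", "specific", "risk", "ratios", "(", "RRs", ")"]
ner_tags = [0, 0, 0, 3, 4, 0, 1, 0]

# Collect the tokens carrying a non-zero tag: in this example, the
# long-form span "risk ratios" and its abbreviation "RRs".
tagged = [(tok, tag) for tok, tag in zip(tokens, ner_tags) if tag != 0]
print(tagged)  # [('risk', 3), ('ratios', 4), ('RRs', 1)]
```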
### Data Splits
| | Train | Valid | Test |
| ----- | ------ | ----- | ---- |
| Filtered | 112652 | 24140 | 24140|
| Unfiltered | 113860 | 24399 | 24399|
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The data was extracted from PLOS journals online, then tokenized and normalized.
#### Who are the source language producers?
PLOS Journal
## Additional Information
### Dataset Curators
The dataset was initially created by Leonardo Zilio, Hadeel Saadany, Prashant Sharma,
Diptesh Kanojia, Constantin Orasan.
### Licensing Information
CC-BY-SA 4.0
### Citation Information
[Needs More Information]
### Installation
We use the custom NER pipeline in the [spaCy transformers](https://spacy.io/universe/project/spacy-transformers) library to train our models. This library supports training via any pre-trained language models available at the :rocket: [HuggingFace repository](https://huggingface.co/).<br/>
Please see the instructions at these websites to set up your own custom training with our dataset and reproduce the experiments using spaCy.<br/>
Alternatively, you can reproduce the experiments via the Python notebook we [provide here](https://github.com/surrey-nlp/PLOD-AbbreviationDetection/blob/main/nbs/fine_tuning_abbr_det.ipynb), which uses the HuggingFace Trainer class to perform the same experiments. The exact hyperparameters can be obtained from the model README cards linked below. Before starting, please perform the following steps:
```bash
git clone https://github.com/surrey-nlp/PLOD-AbbreviationDetection
cd PLOD-AbbreviationDetection
pip install -r requirements.txt
```
Now, you can use the notebook to reproduce the experiments.
### Model(s)
Our best-performing models are hosted on the HuggingFace models repository:
| Models | [`PLOD - Unfiltered`](https://huggingface.co/datasets/surrey-nlp/PLOD-unfiltered) | [`PLOD - Filtered`](https://huggingface.co/datasets/surrey-nlp/PLOD-filtered) | Description |
| --- | :---: | :---: | --- |
| [RoBERTa<sub>large</sub>](https://huggingface.co/roberta-large) | [RoBERTa<sub>large</sub>-finetuned-abbr](https://huggingface.co/surrey-nlp/roberta-large-finetuned-abbr) | -soon- | Fine-tuning on the RoBERTa<sub>large</sub> language model |
| [RoBERTa<sub>base</sub>](https://huggingface.co/roberta-base) | -soon- | [RoBERTa<sub>base</sub>-finetuned-abbr](https://huggingface.co/surrey-nlp/roberta-large-finetuned-abbr) | Fine-tuning on the RoBERTa<sub>base</sub> language model |
| [AlBERT<sub>large-v2</sub>](https://huggingface.co/albert-large-v2) | [AlBERT<sub>large-v2</sub>-finetuned-abbDet](https://huggingface.co/surrey-nlp/albert-large-v2-finetuned-abbDet) | -soon- | Fine-tuning on the AlBERT<sub>large-v2</sub> language model |
Via the links provided above, the model(s) can be used with the help of the Inference API directly in the web browser. We have placed some examples with the API for testing.<br/>
### Usage
You can use the HuggingFace model links above to find instructions for using these models locally in Python, via the notebook provided in the Git repo.
|
jose-h-solorzano/synth-forgetting-generalization-10 | ---
dataset_info:
features:
- name: input
sequence: float64
- name: output
sequence: float64
splits:
- name: train
num_bytes: 16320000.0
num_examples: 40000
- name: test
num_bytes: 4080000.0
num_examples: 10000
download_size: 14193106
dataset_size: 20400000.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Sandipan1994/Inference_Attribute_Prediction | ---
dataset_info:
features:
- name: step
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 4985662
num_examples: 23330
- name: test
num_bytes: 1319178
num_examples: 6216
- name: validation
num_bytes: 739704
num_examples: 3403
download_size: 1123248
dataset_size: 7044544
---
# Dataset Card for "Inference_Attribute_Prediction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cahya/instructions-fi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 936328.3665338645
num_examples: 1807
- name: test
num_bytes: 52334.900398406375
num_examples: 101
- name: validation
num_bytes: 51816.73306772908
num_examples: 100
download_size: 640961
dataset_size: 1040480.0
---
# Dataset Card for "instructions-fi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4 | ---
pretty_name: Evaluation run of jondurbin/airoboros-13b-gpt4-1.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-13b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T02:48:34.723506](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4/blob/main/results_2023-10-23T02-48-34.723506.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05285234899328859,\n\
\ \"em_stderr\": 0.0022912930700355423,\n \"f1\": 0.11820364932885902,\n\
\ \"f1_stderr\": 0.0026017641356238645,\n \"acc\": 0.41988112541310807,\n\
\ \"acc_stderr\": 0.009659506214512746\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.05285234899328859,\n \"em_stderr\": 0.0022912930700355423,\n\
\ \"f1\": 0.11820364932885902,\n \"f1_stderr\": 0.0026017641356238645\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \
\ \"acc_stderr\": 0.007357713523222348\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803143\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T16_14_52.979927
path:
- '**/details_harness|drop|3_2023-10-22T16-14-52.979927.parquet'
- split: 2023_10_23T02_48_34.723506
path:
- '**/details_harness|drop|3_2023-10-23T02-48-34.723506.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T02-48-34.723506.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T16_14_52.979927
path:
- '**/details_harness|gsm8k|5_2023-10-22T16-14-52.979927.parquet'
- split: 2023_10_23T02_48_34.723506
path:
- '**/details_harness|gsm8k|5_2023-10-23T02-48-34.723506.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T02-48-34.723506.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:58.077469.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:58.077469.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T16_14_52.979927
path:
- '**/details_harness|winogrande|5_2023-10-22T16-14-52.979927.parquet'
- split: 2023_10_23T02_48_34.723506
path:
- '**/details_harness|winogrande|5_2023-10-23T02-48-34.723506.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T02-48-34.723506.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_26_58.077469
path:
- results_2023-07-19T18:26:58.077469.parquet
- split: 2023_10_22T16_14_52.979927
path:
- results_2023-10-22T16-14-52.979927.parquet
- split: 2023_10_23T02_48_34.723506
path:
- results_2023-10-23T02-48-34.723506.parquet
- split: latest
path:
- results_2023-10-23T02-48-34.723506.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4",
"harness_winogrande_5",
split="train")
```
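The timestamped split names follow directly from the run timestamps: judging by the split names listed in this card's config, ":" and "-" in the timestamp are simply replaced with "_". A minimal sketch of that mapping (the helper name is illustrative, not part of any library):

```python
# Split names in this card are run timestamps with ":" and "-" replaced
# by "_", e.g. "2023-10-23T02:48:34.723506" -> "2023_10_23T02_48_34.723506".

def timestamp_to_split(run_timestamp: str) -> str:
    """Convert a run timestamp into the split name used by this dataset."""
    return run_timestamp.replace("-", "_").replace(":", "_")

# Select the split for a specific evaluation run instead of "latest":
split_name = timestamp_to_split("2023-10-23T02:48:34.723506")
print(split_name)  # 2023_10_23T02_48_34.723506
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run rather than the latest one.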
## Latest results
These are the [latest results from run 2023-10-23T02:48:34.723506](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4/blob/main/results_2023-10-23T02-48-34.723506.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.05285234899328859,
"em_stderr": 0.0022912930700355423,
"f1": 0.11820364932885902,
"f1_stderr": 0.0026017641356238645,
"acc": 0.41988112541310807,
"acc_stderr": 0.009659506214512746
},
"harness|drop|3": {
"em": 0.05285234899328859,
"em_stderr": 0.0022912930700355423,
"f1": 0.11820364932885902,
"f1_stderr": 0.0026017641356238645
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.007357713523222348
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803143
}
}
```
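For this run, the aggregated "all" entry appears to be a plain unweighted mean of the per-task metrics; for instance, the overall `acc` equals the average of the gsm8k and winogrande accuracies. A quick check, with the values copied from the results above:

```python
# Per-task accuracies copied from the latest results shown above.
task_acc = {
    "harness|gsm8k|5": 0.07733131159969674,
    "harness|winogrande|5": 0.7624309392265194,
}

# The "all" block's acc matches the unweighted mean of the task accuracies.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # 0.41988112541310807
```

The same relationship holds for `em` and `f1`, which in this run come from a single task (drop) and therefore carry over unchanged into the "all" block.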
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Saleh11623/stanfordQuestionAnsweringDataset | ---
task_categories:
- table-question-answering
tags:
- not-for-all-audiences
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/eternity_larva_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eternity_larva/エタニティラルバ (Touhou)
This is the dataset of eternity_larva/エタニティラルバ (Touhou), containing 500 images and their tags.
The core tags of this character are `butterfly_wings, wings, short_hair, leaf_on_head, aqua_hair, hair_between_eyes, orange_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 591.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eternity_larva_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 347.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eternity_larva_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1109 | 717.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eternity_larva_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 525.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eternity_larva_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1109 | 980.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eternity_larva_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/eternity_larva_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; specific outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, antennae, blush, fairy, green_dress, leaf, multicolored_dress, open_mouth, short_sleeves, smile, solo, upper_body, yellow_eyes |
| 1 | 9 |  |  |  |  |  | 1girl, antennae, barefoot, fairy, green_dress, leaf, multicolored_dress, short_sleeves, smile, solo, full_body, open_mouth, blush, brown_eyes |
| 2 | 5 |  |  |  |  |  | 1girl, antennae, blush, closed_mouth, fairy, green_dress, leaf, multicolored_dress, short_sleeves, smile, solo, yellow_eyes, feet_out_of_frame |
| 3 | 6 |  |  |  |  |  | 1girl, antennae, closed_mouth, fairy, green_dress, leaf, multicolored_dress, short_sleeves, simple_background, solo, upper_body, white_background, smile, blush, looking_at_viewer, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | antennae | blush | fairy | green_dress | leaf | multicolored_dress | open_mouth | short_sleeves | smile | solo | upper_body | yellow_eyes | barefoot | full_body | brown_eyes | closed_mouth | feet_out_of_frame | simple_background | white_background | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:--------|:--------------|:-------|:---------------------|:-------------|:----------------|:--------|:-------|:-------------|:--------------|:-----------|:------------|:-------------|:---------------|:--------------------|:--------------------|:-------------------|:--------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | X | | | | X | X | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | X | | | | X | | X | X | X |
|
RAMILISON/rajo | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-inverse-scaling__hindsight-neglect-10shot-inverse-scali-383fe9-1695459612 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/hindsight-neglect-10shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: inverse-scaling/hindsight-neglect-10shot
dataset_config: inverse-scaling--hindsight-neglect-10shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: inverse-scaling/hindsight-neglect-10shot
* Config: inverse-scaling--hindsight-neglect-10shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
SEACrowd/su_id_asr | ---
tags:
- speech-recognition
language:
- sun
---
# su_id_asr
A Sundanese ASR training dataset containing ~220K utterances.
This dataset was collected by Google in Indonesia.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{sodimana18_sltu,
author={Keshan Sodimana and Pasindu {De Silva} and Supheakmungkol Sarin and Oddur Kjartansson and Martin Jansche and Knot Pipatsrisawat and Linne Ha},
title={{A Step-by-Step Process for Building TTS Voices Using Open Source Data and Frameworks for Bangla, Javanese, Khmer, Nepali, Sinhala, and Sundanese}},
year=2018,
booktitle={Proc. 6th Workshop on Spoken Language Technologies for Under-Resourced Languages (SLTU 2018)},
pages={66--70},
doi={10.21437/SLTU.2018-14}
}
```
## License
Attribution-ShareAlike 4.0 International.
## Homepage
[https://indonlp.github.io/nusa-catalogue/card.html?su_id_asr](https://indonlp.github.io/nusa-catalogue/card.html?su_id_asr)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
rcds/swiss_law_area_prediction | ---
license: cc-by-sa-4.0
annotations_creators:
- machine-generated
language:
- de
- fr
- it
language_creators:
- expert-generated
multilinguality:
- multilingual
pretty_name: Law Area Prediction
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
---
# Dataset Card for Law Area Prediction
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset contains cases to be classified into the four main areas of law: Public, Civil, Criminal, and Social.
These can be classified further into sub-areas:
```
"public": ['Tax', 'Urban Planning and Environmental', 'Expropriation', 'Public Administration', 'Other Fiscal'],
"civil": ['Rental and Lease', 'Employment Contract', 'Bankruptcy', 'Family', 'Competition and Antitrust', 'Intellectual Property'],
"criminal": ['Substantive Criminal', 'Criminal Procedure']
```
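As a minimal illustration (the helper below is ours, not shipped with the dataset), a sub-area label can be mapped back to its main area with a simple lookup; note that no sub-areas are listed for the Social area:

```python
# Sub-area -> main-area lookup, mirroring the mapping above.
SUB_AREAS = {
    "public": ["Tax", "Urban Planning and Environmental", "Expropriation",
               "Public Administration", "Other Fiscal"],
    "civil": ["Rental and Lease", "Employment Contract", "Bankruptcy",
              "Family", "Competition and Antitrust", "Intellectual Property"],
    "criminal": ["Substantive Criminal", "Criminal Procedure"],
}

def main_area(sub_area):
    """Return the main area of law for a given sub-area, or None if unknown."""
    for main, subs in SUB_AREAS.items():
        if sub_area in subs:
            return main
    return None

print(main_area("Tax"))  # → public
```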
### Supported Tasks and Leaderboards
Law Area Prediction can be used as a text classification task.
### Languages
Switzerland has four official languages, three of which (German, French, and Italian) are represented in this dataset. The decisions are written by the judges and clerks in the language of the proceedings.
| Language | Subset | Number of Documents|
|------------|------------|--------------------|
| German | **de** | 127K |
| French | **fr** | 156K |
| Italian | **it** | 46K |
## Dataset Structure
- decision_id: unique identifier for the decision
- facts: facts section of the decision
- considerations: considerations section of the decision
- law_area: label of the decision (main area of law)
- law_sub_area: sub area of law of the decision
- language: language of the decision
- year: year of the decision
- court: court of the decision
- chamber: chamber of the decision
- canton: canton of the decision
- region: region of the decision
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
The dataset was split in a date-stratified manner:
- Train: 2002-2015
- Validation: 2016-2017
- Test: 2018-2022
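The split can be reproduced with a simple filter on the `year` field; a minimal sketch (the rows are toy values, not real decisions):

```python
# Date-stratified split on toy rows using the card's `year` field.
rows = [
    {"decision_id": "a", "year": 2003},
    {"decision_id": "b", "year": 2015},
    {"decision_id": "c", "year": 2016},
    {"decision_id": "d", "year": 2017},
    {"decision_id": "e", "year": 2018},
    {"decision_id": "f", "year": 2022},
]

train = [r for r in rows if 2002 <= r["year"] <= 2015]
validation = [r for r in rows if 2016 <= r["year"] <= 2017]
test = [r for r in rows if 2018 <= r["year"] <= 2022]

print(len(train), len(validation), len(test))  # → 2 2 2
```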
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
The original data are published from the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML.
#### Who are the source language producers?
The decisions are written by the judges and clerks in the language of the proceedings.
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
We release the data under CC-BY-4.0, which complies with the court's licensing terms (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf)
© Swiss Federal Supreme Court, 2002-2022
The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf
### Citation Information
Please cite our [ArXiv-Preprint](https://arxiv.org/abs/2306.09237)
```
@misc{rasiah2023scale,
title={SCALE: Scaling up the Complexity for Advanced Language Model Evaluation},
author={Vishvaksenan Rasiah and Ronja Stern and Veton Matoshi and Matthias Stürmer and Ilias Chalkidis and Daniel E. Ho and Joel Niklaus},
year={2023},
eprint={2306.09237},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
|
tarekziade/docornot | ---
license: other
dataset_info:
features:
- name: image
dtype: image
- name: is_document
dtype:
class_label:
names:
'0': 'no'
'1': 'yes'
splits:
- name: train
num_bytes: 3747106867.2
num_examples: 12800
- name: test
num_bytes: 468388358.4
num_examples: 1600
- name: validation
num_bytes: 468388358.4
num_examples: 1600
download_size: 4682888903
dataset_size: 4683883584.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
The `DocOrNot` dataset contains 50% pictures and 50% documents.
It was built using 8k images from each of these sources:
- RVL CDIP (Small) - https://www.kaggle.com/datasets/uditamin/rvl-cdip-small - license: https://www.industrydocuments.ucsf.edu/help/copyright/
- Flickr8k - https://www.kaggle.com/datasets/adityajn105/flickr8k - license: https://creativecommons.org/publicdomain/zero/1.0/
It can be used to train a model that classifies an image as either a picture or a document.
Source code used to generate this dataset: https://github.com/tarekziade/docornot
|
jgwill/gia-young-picasso-v03-201216-var2 | ---
license: creativeml-openrail-m
---
|
shahbajsingh/nyc-taxi-fare-prediction-train | ---
dataset_info:
features:
- name: key
dtype: string
- name: fare_amount
dtype: float64
- name: pickup_datetime
dtype: string
- name: pickup_longitude
dtype: float64
- name: pickup_latitude
dtype: float64
- name: dropoff_longitude
dtype: float64
- name: dropoff_latitude
dtype: float64
- name: passenger_count
dtype: int64
splits:
- name: train
num_bytes: 5926405250
num_examples: 55423856
download_size: 3775003042
dataset_size: 5926405250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SemRel/SemRel2024 | ---
language:
- afr
- amh
- arb
- arq
- ary
- eng
- es
- hau
- hin
- ind
- kin
- mar
- pan
- tel
dataset_info:
- config_name: afr
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: test
num_bytes: 65243
num_examples: 375
- name: dev
num_bytes: 66249
num_examples: 375
download_size: 95864
dataset_size: 131492
- config_name: amh
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 209475
num_examples: 992
- name: test
num_bytes: 36637
num_examples: 171
- name: dev
num_bytes: 19498
num_examples: 95
download_size: 153682
dataset_size: 265610
- config_name: arb
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: test
num_bytes: 110473
num_examples: 595
- name: dev
num_bytes: 5846
num_examples: 32
download_size: 72348
dataset_size: 116319
- config_name: arq
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 170025
num_examples: 1261
- name: test
num_bytes: 79323
num_examples: 583
- name: dev
num_bytes: 12181
num_examples: 97
download_size: 149472
dataset_size: 261529
- config_name: ary
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 382561
num_examples: 924
- name: test
num_bytes: 175568
num_examples: 426
- name: dev
num_bytes: 27975
num_examples: 71
download_size: 274828
dataset_size: 586104
- config_name: eng
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 844975
num_examples: 5500
- name: test
num_bytes: 374647
num_examples: 2600
- name: dev
num_bytes: 36697
num_examples: 250
download_size: 868674
dataset_size: 1256319
- config_name: esp
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 316713
num_examples: 1562
- name: test
num_bytes: 123222
num_examples: 600
- name: dev
num_bytes: 28981
num_examples: 140
download_size: 323584
dataset_size: 468916
- config_name: hau
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 403474
num_examples: 1736
- name: test
num_bytes: 142238
num_examples: 603
- name: dev
num_bytes: 49236
num_examples: 212
download_size: 328542
dataset_size: 594948
- config_name: hin
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: test
num_bytes: 377385
num_examples: 968
- name: dev
num_bytes: 113047
num_examples: 288
download_size: 217493
dataset_size: 490432
- config_name: ind
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: test
num_bytes: 68185
num_examples: 360
- name: dev
num_bytes: 26579
num_examples: 144
download_size: 68263
dataset_size: 94764
- config_name: kin
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 234520
num_examples: 778
- name: test
num_bytes: 67211
num_examples: 222
- name: dev
num_bytes: 30758
num_examples: 102
download_size: 219256
dataset_size: 332489
- config_name: mar
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 555224
num_examples: 1155
- name: test
num_bytes: 139343
num_examples: 298
- name: dev
num_bytes: 146496
num_examples: 293
download_size: 381039
dataset_size: 841063
- config_name: pan
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: test
num_bytes: 307401
num_examples: 634
- name: dev
num_bytes: 117984
num_examples: 242
download_size: 166402
dataset_size: 425385
- config_name: tel
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: float64
splits:
- name: train
num_bytes: 561688
num_examples: 1146
- name: test
num_bytes: 145249
num_examples: 297
- name: dev
num_bytes: 64775
num_examples: 130
download_size: 347275
dataset_size: 771712
configs:
- config_name: afr
data_files:
- split: test
path: afr/test-*
- split: dev
path: afr/dev-*
- config_name: amh
data_files:
- split: train
path: amh/train-*
- split: test
path: amh/test-*
- split: dev
path: amh/dev-*
- config_name: arb
data_files:
- split: test
path: arb/test-*
- split: dev
path: arb/dev-*
- config_name: arq
data_files:
- split: train
path: arq/train-*
- split: test
path: arq/test-*
- split: dev
path: arq/dev-*
- config_name: ary
data_files:
- split: train
path: ary/train-*
- split: test
path: ary/test-*
- split: dev
path: ary/dev-*
- config_name: eng
data_files:
- split: train
path: eng/train-*
- split: test
path: eng/test-*
- split: dev
path: eng/dev-*
- config_name: esp
data_files:
- split: train
path: esp/train-*
- split: test
path: esp/test-*
- split: dev
path: esp/dev-*
- config_name: hau
data_files:
- split: train
path: hau/train-*
- split: test
path: hau/test-*
- split: dev
path: hau/dev-*
- config_name: hin
data_files:
- split: test
path: hin/test-*
- split: dev
path: hin/dev-*
- config_name: ind
data_files:
- split: test
path: ind/test-*
- split: dev
path: ind/dev-*
- config_name: kin
data_files:
- split: train
path: kin/train-*
- split: test
path: kin/test-*
- split: dev
path: kin/dev-*
- config_name: mar
data_files:
- split: train
path: mar/train-*
- split: test
path: mar/test-*
- split: dev
path: mar/dev-*
- config_name: pan
data_files:
- split: test
path: pan/test-*
- split: dev
path: pan/dev-*
- config_name: tel
data_files:
- split: train
path: tel/train-*
- split: test
path: tel/test-*
- split: dev
path: tel/dev-*
task_categories:
- text-classification
- sentence-similarity
---
## Dataset Description
- **Homepage:** https://semantic-textual-relatedness.github.io
- **Repository:** [GitHub](https://github.com/semantic-textual-relatedness/Semantic_Relatedness_SemEval2024)
- **Paper:** [SemRel2024: A Collection of Semantic Textual Relatedness Datasets for 14 Languages](https://arxiv.org/abs/2402.08638)
- **Paper:** [SemEval Task 1: Semantic Textual Relatedness for African and Asian Languages](https://arxiv.org/pdf/2403.18933.pdf)
- **Leaderboard:** https://codalab.lisn.upsaclay.fr/competitions/16799#results
- **Point of Contact:** [Nedjma Ousidhoum](mailto:nedjma.ousidhoum@gmail.com)
### Dataset Summary
SemRel2024 is a collection of Semantic Textual Relatedness (STR) datasets for 14 languages, including African and Asian languages. The datasets are composed of sentence pairs, each assigned a relatedness score between 0 (completely unrelated) and 1 (maximally related), with a large range of expected relatedness values.
The SemRel2024 dataset was used as part of SemEval-2024 Shared Task 1, which aims to evaluate the ability of systems to measure the semantic relatedness between two sentences.
### Languages
The SemRel2024 dataset covers the following 14 languages:
1. Afrikaans (_afr_)
2. Algerian Arabic (_arq_)
3. Amharic (_amh_)
4. English (_eng_)
5. Hausa (_hau_)
6. Indonesian (_ind_)
7. Hindi (_hin_)
8. Kinyarwanda (_kin_)
9. Marathi (_mar_)
10. Modern Standard Arabic (_arb_)
11. Moroccan Arabic (_ary_)
12. Punjabi (_pan_)
13. Spanish (_esp_)
14. Telugu (_tel_)
**Note**: Spanish test labels are all -1 because the Spanish team retained the gold test labels to avoid contamination problems in future benchmarking. We refer to the [CodaLab contest website](https://codalab.lisn.upsaclay.fr/competitions/15715) to evaluate your predictions, which will remain open.
## Dataset Structure
### Data Instances
Each instance in the dataset consists of two text segments and a relatedness score indicating the degree of semantic relatedness between them.
```
{
"sentence1": "string",
"sentence2": "string",
"label": float
}
```
- sentence1: a string feature representing the first text segment.
- sentence2: a string feature representing the second text segment.
- label: a float value representing the semantic relatedness score between sentence1 and sentence2, typically ranging from 0 (not related at all) to 1 (highly related).
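Systems in the shared task were ranked by Spearman rank correlation between predicted and gold relatedness scores. A minimal, dependency-free sketch of that evaluation (tie handling omitted; the example values are illustrative, not taken from the data):

```python
# Spearman rank correlation between predicted and gold relatedness scores.
# No tie handling: assumes all values within each list are distinct.
def spearman(xs, ys):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for position, i in enumerate(order, start=1):
            r[i] = position
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

gold = [0.1, 0.5, 0.9, 0.3]
pred = [0.2, 0.4, 0.8, 0.1]
print(spearman(gold, pred))  # → 0.8
```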
## Citation Information
If you use the SemRel2024 dataset in your research, please cite the following papers:
```
@misc{ousidhoum2024semrel2024,
title={SemRel2024: A Collection of Semantic Textual Relatedness Datasets for 14 Languages},
author={Nedjma Ousidhoum and Shamsuddeen Hassan Muhammad and Mohamed Abdalla and Idris Abdulmumin and Ibrahim Said Ahmad and
Sanchit Ahuja and Alham Fikri Aji and Vladimir Araujo and Abinew Ali Ayele and Pavan Baswani and Meriem Beloucif and
Chris Biemann and Sofia Bourhim and Christine De Kock and Genet Shanko Dekebo and
Oumaima Hourrane and Gopichand Kanumolu and Lokesh Madasu and Samuel Rutunda and Manish Shrivastava and
Thamar Solorio and Nirmal Surange and Hailegnaw Getaneh Tilaye and Krishnapriya Vishnubhotla and Genta Winata and
Seid Muhie Yimam and Saif M. Mohammad},
year={2024},
eprint={2402.08638},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```
@inproceedings{ousidhoum-etal-2024-semeval,
title = "{S}em{E}val-2024 Task 1: Semantic Textual Relatedness for African and Asian Languages",
author = "Ousidhoum, Nedjma and Muhammad, Shamsuddeen Hassan and Abdalla, Mohamed and Abdulmumin, Idris and
Ahmad,Ibrahim Said and Ahuja, Sanchit and Aji, Alham Fikri and Araujo, Vladimir and Beloucif, Meriem and
De Kock, Christine and Hourrane, Oumaima and Shrivastava, Manish and Solorio, Thamar and Surange, Nirmal and
Vishnubhotla, Krishnapriya and Yimam, Seid Muhie and Mohammad, Saif M.",
booktitle = "Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024)",
year = "2024",
publisher = "Association for Computational Linguistics"
}
```
|
Softage-AI/multilingual-audio_prompts | ---
license: mit
language:
- en
- hi
- gu
- pa
- as
- ur
- bn
---
# Multilingual Speech Dataset
## Description
This dataset contains 40 voice prompts in different Indian languages. Each record links a text prompt to its corresponding audio recording by a native speaker.
## Data attributes
- Language: Assamese, Hindi, Urdu, Gujarati, Bengali, Punjabi
- Prompt: Text of the prompt in the corresponding language (string)
- Audio Path: Link to the audio recording of the prompt in the corresponding language.
## Dataset Source
This dataset is curated by the delivery team @SoftAge
## Limitations and Biases
- The dataset size might not capture the full diversity of languages or prompts.
- The source of the data might contain biases in the vocabulary, phrasing, or cultural references used in the prompts.
- The audio recordings might represent different accents or dialects within each language.
## Potential Uses
- Training multilingual speech recognition and generation models.
- Evaluating the performance of speech processing systems across different languages.
|
asgaardlab/GamePhysics-FullResolution | ---
dataset_info:
features:
- name: id
dtype: string
- name: game
dtype: string
- name: filepath
dtype: string
- name: filename
dtype: string
- name: archive
dtype: string
- name: reddit_url
dtype: string
splits:
- name: validation
num_bytes: 3692759
num_examples: 26954
download_size: 1232477
dataset_size: 3692759
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
license: creativeml-openrail-m
task_categories:
- video-classification
language:
- en
tags:
- video-game
- game
- video-understanding
- ood
- vidoe-ood
pretty_name: GamePhysics
size_categories:
- 10K<n<100K
---
# GamePhysics Dataset
[](https://asgaardlab.github.io/CLIPxGamePhysics/)
[](https://arxiv.org/abs/2203.11096)
[](https://huggingface.co/spaces/taesiri/CLIPxGamePhysics)
The GamePhysics dataset is a collection of gameplay bug videos sourced from the [GamePhysics subreddit](https://www.reddit.com/r/GamePhysics/).
## Sample videos
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/9rqabp.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/g5pm35.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/6xplqg.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video>
<video src="https://asgaardlab.github.io/CLIPxGamePhysics/static/videos/4jirzj.mp4" controls="controls" muted="muted" playsinline="playsinline" width=480></video> |
Toadoum/Ngambay-French-bitext-dataset | ---
license: apache-2.0
---
|
jarod0411/cancer | ---
dataset_info:
features:
- name: smiles
dtype: string
- name: scaffold_smiles
dtype: string
- name: selfies
dtype: string
- name: scaffold_selfies
dtype: string
- name: QED
dtype: float64
- name: DockingScore
dtype: float64
splits:
- name: train
num_bytes: 579751946
num_examples: 1253132
- name: test
num_bytes: 64320903
num_examples: 139222
download_size: 225971277
dataset_size: 644072849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
TigerZheng/PFCdata | ---
license: mit
---
|
VaggP/style_transfer_paintings_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': generated
'1': original
splits:
- name: train
num_bytes: 6904526021.588
num_examples: 4913
- name: test
num_bytes: 2137893838.395
num_examples: 1235
download_size: 10900941346
dataset_size: 9042419859.983
---
# Dataset Card for "style_transfer_paintings_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
grammarly/pseudonymization-data | ---
license: apache-2.0
task_categories:
- text-classification
- summarization
language:
- en
pretty_name: Pseudonymization data
size_categories:
- 100M<n<1T
---
This repository contains all the datasets used in our paper "Privacy- and Utility-Preserving NLP with Anonymized data: A case study of Pseudonymization" (https://aclanthology.org/2023.trustnlp-1.20).
# Dataset Card for Pseudonymization data
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/grammarly/pseudonymization-data
- **Paper:** https://aclanthology.org/2023.trustnlp-1.20/
- **Point of Contact:** oleksandr.yermilov@ucu.edu.ua
### Dataset Summary
This dataset repository contains all the datasets used in our paper. It includes datasets for different NLP tasks, pseudonymized by different algorithms; a dataset for training a Seq2Seq model that translates text from its original form to a "pseudonymized" one; and a dataset for training a model that detects whether a text has been pseudonymized.
### Languages
English.
## Dataset Structure
Each folder contains preprocessed train versions of different datasets (e.g., the `cnn_dm` folder contains the preprocessed CNN/Daily Mail dataset). Each file name indicates the algorithm from the paper used for its preprocessing (e.g., `ner_ps_spacy_imdb.csv` is the IMDB dataset preprocessed with NER-based pseudonymization using the spaCy system).
## Dataset Creation
Datasets in the `imdb` and `cnn_dm` folders were created by pseudonymizing the corresponding datasets with different pseudonymization algorithms.
Datasets in the `detection` folder combine original datasets and pseudonymized datasets, grouped by the pseudonymization algorithm used.
Datasets in the `seq2seq` folder are used for training the Seq2Seq transformer-based pseudonymization model. First, a dataset was fetched from Wikipedia articles, which was then preprocessed with either the NER-PS<sub>FLAIR</sub> or the NER-PS<sub>spaCy</sub> algorithm.
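As a minimal illustration of the kind of transformation NER-based pseudonymization applies, the sketch below replaces detected entity mentions with consistent surrogate names. This is illustrative only: the entity spans and replacement names here are made up, and the actual pipelines in the paper rely on FLAIR or spaCy NER models to find the entities.

```python
# Minimal sketch of NER-style pseudonymization: swap each detected
# entity surface form for an assigned surrogate. In the real pipeline,
# the (surface, pseudonym) pairs come from an NER system plus a
# replacement strategy; here they are hard-coded for illustration.

def pseudonymize(text: str, entities: list[str], replacements: list[str]) -> str:
    """Replace each entity mention in `text` with its assigned pseudonym."""
    for surface, pseudonym in zip(entities, replacements):
        text = text.replace(surface, pseudonym)
    return text

original = "John Smith moved from Paris to Berlin."
entities = ["John Smith", "Paris", "Berlin"]       # hypothetical NER output
replacements = ["Alex Brown", "Madrid", "Vienna"]  # hypothetical surrogates
print(pseudonymize(original, entities, replacements))
# -> Alex Brown moved from Madrid to Vienna.
```

The key property preserved here, and in the paper's pipelines, is that the text stays grammatical and task-usable while the identifying entities are no longer the original ones.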
### Personal and Sensitive Information
These datasets contain no sensitive or personal information; they are based entirely on data present in open sources (Wikipedia and standard NLP datasets).
## Considerations for Using the Data
### Known Limitations
Only English texts are present in the datasets, and only a limited set of named entity types is replaced. Please also check the Limitations section of our paper.
## Additional Information
### Dataset Curators
Oleksandr Yermilov (oleksandr.yermilov@ucu.edu.ua)
### Citation Information
```
@inproceedings{yermilov-etal-2023-privacy,
title = "Privacy- and Utility-Preserving {NLP} with Anonymized data: A case study of Pseudonymization",
author = "Yermilov, Oleksandr and
Raheja, Vipul and
Chernodub, Artem",
booktitle = "Proceedings of the 3rd Workshop on Trustworthy Natural Language Processing (TrustNLP 2023)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.trustnlp-1.20",
doi = "10.18653/v1/2023.trustnlp-1.20",
pages = "232--241",
abstract = "This work investigates the effectiveness of different pseudonymization techniques, ranging from rule-based substitutions to using pre-trained Large Language Models (LLMs), on a variety of datasets and models used for two widely used NLP tasks: text classification and summarization. Our work provides crucial insights into the gaps between original and anonymized data (focusing on the pseudonymization technique) and model quality and fosters future research into higher-quality anonymization techniques better to balance the trade-offs between data protection and utility preservation. We make our code, pseudonymized datasets, and downstream models publicly available.",
}
``` |
Varun1808/new_dataset_finetune1 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 13245
num_examples: 55
download_size: 4687
dataset_size: 13245
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "new_dataset_finetune1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CultriX/MsitralTrix-test-dpo | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- chemistry
- biology
- dpo
- medical
pretty_name: MistralTrix-test-dpo
size_categories:
- n<1K
--- |
Saxo/OpenOrca_cleaned_kor_linkbricks_single_dataset_with_prompt_text_huggingface | ---
license: apache-2.0
---
|
Khavee/Khavee-klon | ---
license: mit
---
|
one-sec-cv12/chunk_108 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23229785088.0
num_examples: 241856
download_size: 20977329323
dataset_size: 23229785088.0
---
# Dataset Card for "chunk_108"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction | ---
pretty_name: Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SebastianSchramm/Cerebras-GPT-111M-instruction](https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T16:31:53.265956](https://huggingface.co/datasets/open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction/blob/main/results_2023-10-24T16-31-53.265956.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00010486577181208053,\n\
\ \"em_stderr\": 0.00010486577181208799,\n \"f1\": 0.0016642197986577185,\n\
\ \"f1_stderr\": 0.00029156266897188764,\n \"acc\": 0.2580899763220205,\n\
\ \"acc_stderr\": 0.007022563065489298\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.00010486577181208799,\n\
\ \"f1\": 0.0016642197986577185,\n \"f1_stderr\": 0.00029156266897188764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.516179952644041,\n\
\ \"acc_stderr\": 0.014045126130978596\n }\n}\n```"
repo_url: https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T16_31_53.265956
path:
- '**/details_harness|drop|3_2023-10-24T16-31-53.265956.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T16-31-53.265956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T16_31_53.265956
path:
- '**/details_harness|gsm8k|5_2023-10-24T16-31-53.265956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T16-31-53.265956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:50:00.639660.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T13:50:00.639660.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T16_31_53.265956
path:
- '**/details_harness|winogrande|5_2023-10-24T16-31-53.265956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T16-31-53.265956.parquet'
- config_name: results
data_files:
- split: 2023_07_19T13_50_00.639660
path:
- results_2023-07-19T13:50:00.639660.parquet
- split: 2023_10_24T16_31_53.265956
path:
- results_2023-10-24T16-31-53.265956.parquet
- split: latest
path:
- results_2023-10-24T16-31-53.265956.parquet
---
# Dataset Card for Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [SebastianSchramm/Cerebras-GPT-111M-instruction](https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T16:31:53.265956](https://huggingface.co/datasets/open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction/blob/main/results_2023-10-24T16-31-53.265956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208799,
"f1": 0.0016642197986577185,
"f1_stderr": 0.00029156266897188764,
"acc": 0.2580899763220205,
"acc_stderr": 0.007022563065489298
},
"harness|drop|3": {
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208799,
"f1": 0.0016642197986577185,
"f1_stderr": 0.00029156266897188764
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978596
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
imodels/tabular-classification-of-prompted-llms | ---
license: apache-2.0
---
|
somosnlp/dataset-cultura-guarani_corpus-it | ---
license: cc-by-sa-4.0
dataset_info:
- config_name: default
features:
- name: id
dtype: int64
- name: referencias
dtype: string
- name: preguntas
dtype: string
- name: respuestas
dtype: string
- name: etiquetas
dtype: string
- name: pais
dtype: string
- name: idioma
dtype: string
- name: periodo
dtype: string
splits:
- name: test
num_bytes: 47162
num_examples: 125
- name: train
num_bytes: 511699
num_examples: 1373
download_size: 192461
dataset_size: 558861
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
## Description
An exclusive, curated dataset focused on questions and answers about Guaraní culture, using the book "Ñande Ypykuéra" by Narciso R. Colmán as the base text to ensure accurate and culturally relevant answers.
## Objective
To build an instruction corpus of 1000 high-quality questions and answers. The book has 26 chapters, each with a different text length, so the number of questions and answers per chapter was set in proportion to the length of the text in each chapter. This criterion ensured that information was extracted fairly from every chapter.
In addition to simple/direct questions and answers, summaries and extraction of ideas and characters were included, fostering the model's reading comprehension and critical analysis.
## Credits
Developer:
- Enrique Paiva
Annotators and reviewers:
- Daniel Cabrera
- Leticia Bogado
- Alberto Benítez
- Emmanuel
|
joey234/mmlu-marketing-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 18898
num_examples: 40
download_size: 15795
dataset_size: 18898
---
# Dataset Card for "mmlu-marketing-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AbacusResearch__jaLLAbi | ---
pretty_name: Evaluation run of AbacusResearch/jaLLAbi
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AbacusResearch/jaLLAbi](https://huggingface.co/AbacusResearch/jaLLAbi) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AbacusResearch__jaLLAbi\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T12:32:57.211662](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__jaLLAbi/blob/main/results_2024-02-13T12-32-57.211662.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n\
\ \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n\
\ \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n\
\ \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n\
\ \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n\
\ \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"\
acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n\
\ \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"\
acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n\
\ \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n\
\ }\n}\n```"
repo_url: https://huggingface.co/AbacusResearch/jaLLAbi
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|arc:challenge|25_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|gsm8k|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hellaswag|10_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T12-32-57.211662.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- '**/details_harness|winogrande|5_2024-02-13T12-32-57.211662.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T12-32-57.211662.parquet'
- config_name: results
data_files:
- split: 2024_02_13T12_32_57.211662
path:
- results_2024-02-13T12-32-57.211662.parquet
- split: latest
path:
- results_2024-02-13T12-32-57.211662.parquet
---
# Dataset Card for Evaluation run of AbacusResearch/jaLLAbi
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AbacusResearch/jaLLAbi](https://huggingface.co/AbacusResearch/jaLLAbi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__jaLLAbi",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-13T12:32:57.211662](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__jaLLAbi/blob/main/results_2024-02-13T12-32-57.211662.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lukarape/acoustic_erebuni_7h | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: phone
dtype: string
- name: id
dtype: string
- name: department
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 1648829831.871
num_examples: 2883
download_size: 2124522004
dataset_size: 1648829831.871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
id_newspapers_2018 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- id
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: Indonesian Newspapers 2018
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: date
dtype: string
- name: title
dtype: string
- name: content
dtype: string
config_name: id_newspapers_2018
splits:
- name: train
num_bytes: 1116031922
num_examples: 499164
download_size: 446018349
dataset_size: 1116031922
---
# Dataset Card for Indonesian Newspapers 2018
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Indonesian Newspapers](https://github.com/feryandi/Dataset-Artikel)
- **Repository:** [Indonesian Newspapers](https://github.com/feryandi/Dataset-Artikel)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [feryandi.n@gmail.com](mailto:feryandi.n@gmail.com),
[cahya.wirawan@gmail.com](mailto:cahya.wirawan@gmail.com)
### Dataset Summary
The dataset contains around 500K articles (136M words) from 7 Indonesian newspapers: Detik, Kompas, Tempo,
CNN Indonesia, Sindo, Republika and Poskota. The articles are dated between 1st January 2018 and 20th August 2018
(with a few exceptions dated earlier). The uncompressed size of the 500K JSON files (newspapers-json.tgz) is around 2.2 GB,
and the cleaned, uncompressed version in a single big text file (newspapers.txt.gz) is about 1 GB. The original source
on Google Drive also contains a dataset in HTML format, which includes raw data (pictures, CSS, JavaScript, ...)
from the online news websites. A copy of the original dataset is available at
https://cloud.uncool.ai/index.php/s/mfYEAgKQoY3ebbM
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Indonesian
## Dataset Structure
```
{
'id': 'string',
'url': 'string',
'date': 'string',
'title': 'string',
'content': 'string'
}
```
### Data Instances
An instance from the dataset is
```
{'id': '0',
'url': 'https://www.cnnindonesia.com/olahraga/20161221234219-156-181385/lorenzo-ingin-samai-rekor-rossi-dan-stoner',
'date': '2016-12-22 07:00:00',
'title': 'Lorenzo Ingin Samai Rekor Rossi dan Stoner',
'content': 'Jakarta, CNN Indonesia -- Setelah bergabung dengan Ducati, Jorge Lorenzo berharap bisa masuk dalam jajaran pebalap yang mampu jadi juara dunia kelas utama dengan dua pabrikan berbeda. Pujian Max Biaggi untuk Valentino Rossi Jorge Lorenzo Hadir dalam Ucapan Selamat Natal Yamaha Iannone: Saya Sering Jatuh Karena Ingin yang Terbaik Sepanjang sejarah, hanya ada lima pebalap yang mampu jadi juara kelas utama (500cc/MotoGP) dengan dua pabrikan berbeda, yaitu Geoff Duke, Giacomo Agostini, Eddie Lawson, Valentino Rossi, dan Casey Stoner. Lorenzo ingin bergabung dalam jajaran legenda tersebut. “Fakta ini sangat penting bagi saya karena hanya ada lima pebalap yang mampu menang dengan dua pabrikan berbeda dalam sejarah balap motor.” “Kedatangan saya ke Ducati juga menghadirkan tantangan yang sangat menarik karena hampir tak ada yang bisa menang dengan Ducati sebelumnya, kecuali Casey Stoner. Hal itu jadi motivasi yang sangat bagus bagi saya,” tutur Lorenzo seperti dikutip dari Crash Lorenzo saat ini diliputi rasa penasaran yang besar untuk menunggang sepeda motor Desmosedici yang dipakai tim Ducati karena ia baru sekali menjajal motor tersebut pada sesi tes di Valencia, usai MotoGP musim 2016 berakhir. “Saya sangat tertarik dengan Ducati arena saya hanya memiliki kesempatan mencoba motor itu di Valencia dua hari setelah musim berakhir. Setelah itu saya tak boleh lagi menjajalnya hingga akhir Januari mendatang. Jadi saya menjalani penantian selama dua bulan yang panjang,” kata pebalap asal Spanyol ini. Dengan kondisi tersebut, maka Lorenzo memanfaatkan waktu yang ada untuk liburan dan melepaskan penat. “Setidaknya apa yang terjadi pada saya saat ini sangat bagus karena saya jadi memiliki waktu bebas dan sedikit liburan.” “Namun tentunya saya tak akan larut dalam liburan karena saya harus lebih bersiap, terutama dalam kondisi fisik dibandingkan sebelumnya, karena saya akan menunggangi motor yang sulit dikendarai,” ucap Lorenzo. 
Selama sembilan musim bersama Yamaha, Lorenzo sendiri sudah tiga kali jadi juara dunia, yaitu pada 2010, 2012, dan 2015. (kid)'}
```
### Data Fields
- `id`: id of the sample
- `url`: the url to the original article
- `date`: the publishing date of the article
- `title`: the title of the article
- `content`: the content of the article
### Data Splits
The dataset contains a single train split of 499,164 samples.
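As a quick illustration of working with the fields described above, the snippet below filters a couple of toy records (shaped like the data instance shown earlier — the record values themselves are invented for the sketch) down to articles published in a given year, using the `date` field's `"YYYY-MM-DD HH:MM:SS"` format:

```python
from datetime import datetime

# Toy records shaped like the instance shown above; real rows come from
# the id_newspapers_2018 dataset itself.
articles = [
    {"id": "0", "date": "2016-12-22 07:00:00",
     "title": "Lorenzo Ingin Samai Rekor Rossi dan Stoner"},
    {"id": "1", "date": "2018-03-05 09:30:00",
     "title": "Contoh Artikel 2018"},
]

def published_in(article, year):
    # The `date` field uses the "YYYY-MM-DD HH:MM:SS" format.
    return datetime.strptime(article["date"], "%Y-%m-%d %H:%M:%S").year == year

in_2018 = [a["id"] for a in articles if published_in(a, 2018)]
print(in_2018)  # ['1']
```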
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. The dataset is shared for the sole purpose of aiding open scientific research in Bahasa Indonesia (computing or linguistics), and can only be used for that purpose. The ownership of each article within the dataset belongs to the respective newspaper from which it was extracted, and the maintainer of the repository does not claim ownership of any of the content within it. If you think that this dataset breaches any established copyrights, please contact the repository maintainer.
### Citation Information
[N/A]
### Contributions
Thanks to [@cahya-wirawan](https://github.com/cahya-wirawan) for adding this dataset. |
lshowway/wikipedia.reorder.vso.fr | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 886603410
num_examples: 490371
download_size: 404136391
dataset_size: 886603410
---
# Dataset Card for "wikipedia.reorder.vso.fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bys2058/sd1111 | ---
dataset_info:
features:
- name: image
dtype: image
- name: hair_mask
dtype: image
- name: result_image
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 93744509413.828
num_examples: 54062
download_size: 91885409225
dataset_size: 93744509413.828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sd1111"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nunofofo/rr | ---
license: openrail
---
|
CyberHarem/maihama_ayumu_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of maihama_ayumu/舞浜歩 (THE iDOLM@STER: Million Live!)
This is the dataset of maihama_ayumu/舞浜歩 (THE iDOLM@STER: Million Live!), containing 125 images and their tags.
The core tags of this character are `pink_hair, multicolored_hair, pink_eyes, ponytail, long_hair, blonde_hair, streaked_hair, breasts, bangs, hair_between_eyes`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 113.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maihama_ayumu_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 125 | 86.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maihama_ayumu_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 277 | 167.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maihama_ayumu_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 125 | 108.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maihama_ayumu_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 277 | 204.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maihama_ayumu_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/maihama_ayumu_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, smile, solo, looking_at_viewer, midriff, navel, necklace, bracelet, open_mouth, crop_top, belt, cleavage, earrings, medium_breasts, one_eye_closed, fingerless_gloves, jacket, pants |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, collarbone, simple_background, upper_body, smile, bare_shoulders, two-tone_hair, grey_background, open_mouth, sleeveless_shirt, white_shirt, closed_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | looking_at_viewer | midriff | navel | necklace | bracelet | open_mouth | crop_top | belt | cleavage | earrings | medium_breasts | one_eye_closed | fingerless_gloves | jacket | pants | blush | collarbone | simple_background | upper_body | bare_shoulders | two-tone_hair | grey_background | sleeveless_shirt | white_shirt | closed_mouth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:----------|:--------|:-----------|:-----------|:-------------|:-----------|:-------|:-----------|:-----------|:-----------------|:-----------------|:--------------------|:---------|:--------|:--------|:-------------|:--------------------|:-------------|:-----------------|:----------------|:------------------|:-------------------|:--------------|:---------------|:-------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
CVasNLPExperiments/StanfordCars_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 537329
num_examples: 1000
download_size: 118758
dataset_size: 537329
---
# Dataset Card for "StanfordCars_test_google_flan_t5_xl_mode_T_SPECIFIC_A_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/monotonicity-entailment | ---
license: apache-2.0
---
```
@inproceedings{yanaka-etal-2019-neural,
title = "Can Neural Networks Understand Monotonicity Reasoning?",
author = "Yanaka, Hitomi and
Mineshima, Koji and
Bekki, Daisuke and
Inui, Kentaro and
Sekine, Satoshi and
Abzianidze, Lasha and
Bos, Johan",
booktitle = "Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP",
year = "2019",
pages = "31--40",
}
```
|
blattimer/ScreenEval | ---
license: mit
viewer: false
---
# ScreenEval
contact: Barrett Lattimer, blattimer@asapp.com \
paper: [Fast and Accurate Factual Inconsistency Detection Over Long Documents](https://arxiv.org/abs/2310.13189) \
github: [scale-score](https://github.com/asappresearch/scale-score)
ScreenEval is a novel dataset designed for factual inconsistency detection in long dialogues.
52 TV transcripts were summarized by humans, Longformer, and GPT-4, then each summary sentence was labelled for factual consistency with the source TV transcript.
Additionally, if a summary sentence was factually consistent, labellers provided relevant utterance support in the source document.
ScreenEval is the longest dialogue based dataset by tokens for factual inconsistency detection available to date.
The dialogue domain presents unique challenges such as long-distance coreference resolution and significant noise between relevant utterances.
ScreenEval is the dataset proposed in the paper "Fast and Accurate Factual Inconsistency Detection Over Long Documents" from EMNLP2023.
## Stats at a glance
- 52 TV transcripts
- \>6k tokens per TV transcript
- 624 summary sentences in total (from humans, Longformer, and GPT-4)
- Relevant utterance labels for all factually consistent summary sentences
## Arguments
The following keys can be used to access the different parts of the ScreenEval dataset.
| Key | Type | Description |
| ------ | ------ | ------ |
| original_convo | List[str] | The source document that is to be summarized as a string |
| convo | List[List[str]] | The source document that is to be summarized, split into a list of utterances |
| inferred_summary | List[str] | The summary sentence that is paired with the given source document |
| summary_id | List[str] | The source model for the summary sentence |
| convo_id | List[int] | The ID of the source document |
| annotated_summary | List[str] | The entire associated summary, with the focus summary sentence surrounded by `<mark></mark>`|
| prediction_annotated_source_doc | List[str] | Raw source document |
| agreement | List[float] | Annotator agreement on the summary sentence's factual inconsistency label |
| agg_label | List[bool] | Factual inconsistency label (true -> factually consistent, false -> factually inconsistent) |
| rel_utt | List[List[int]] | The indices of related utterances in the corresponding `convo` list. |
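As a sketch of how these fields fit together, the snippet below computes the share of factually consistent summary sentences per summary source from a few toy rows — the row values are invented for illustration; real rows come from the ScreenEval data itself:

```python
from collections import defaultdict

# Toy rows shaped like the fields described above; values are invented.
rows = [
    {"summary_id": "gpt4", "agg_label": True, "rel_utt": [3, 4]},
    {"summary_id": "gpt4", "agg_label": False, "rel_utt": []},
    {"summary_id": "longformer", "agg_label": True, "rel_utt": [10]},
]

# Count total and factually consistent sentences per summarizer
# (agg_label is True when the sentence is factually consistent).
totals, consistent = defaultdict(int), defaultdict(int)
for row in rows:
    totals[row["summary_id"]] += 1
    consistent[row["summary_id"]] += row["agg_label"]

rates = {k: consistent[k] / totals[k] for k in totals}
print(rates)  # {'gpt4': 0.5, 'longformer': 1.0}
```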
marcos292/LuizCarvalho | ---
license: openrail
---
|
englert-m/reconstruction | ---
dataset_info:
features:
- name: orig
dtype: int32
- name: corrupted
dtype: image
- name: count
dtype: int32
- name: xflip
dtype: int64
- name: yflip
dtype: int64
- name: scale
dtype: float32
- name: rotate_frac
dtype: float32
- name: aniso_w
dtype: float32
- name: aniso_r
dtype: float32
- name: translate_frac
sequence: float32
splits:
- name: train
num_bytes: 147813503004.625
num_examples: 59583403
download_size: 155980537726
dataset_size: 147813503004.625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reconstruction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jing24/sorted_generate_sub_4 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: conf
dtype: float32
splits:
- name: train
num_bytes: 42698509
num_examples: 46640
download_size: 7864808
dataset_size: 42698509
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sorted_generate_sub_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clem/prompts | ---
license: apache-2.0
---
This is my collection of prompts to increase my productivity as a co-founder and CEO at Hugging Face.
sue-ai-taos/WildCard_ColorPreset | ---
license: unlicense
---
|
open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B | ---
pretty_name: Evaluation run of KnutJaegersberg/Walter-Falcon-1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Walter-Falcon-1B](https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T12:28:40.127971](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B/blob/main/results_2023-12-10T12-28-40.127971.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2504304926123702,\n\
\ \"acc_stderr\": 0.03055596076992834,\n \"acc_norm\": 0.2520875598433206,\n\
\ \"acc_norm_stderr\": 0.031361161079445435,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080515,\n \"mc2\": 0.38469934472881057,\n\
\ \"mc2_stderr\": 0.014966198091063187\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28498293515358364,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.310580204778157,\n \"acc_norm_stderr\": 0.013522292098053054\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42381995618402707,\n\
\ \"acc_stderr\": 0.0049315259610357536,\n \"acc_norm\": 0.5491933877713603,\n\
\ \"acc_norm_stderr\": 0.004965572246803867\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.030167533468632688,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.030167533468632688\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756191,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756191\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n\
\ \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.15,\n \
\ \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.023904914311782644,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.023904914311782644\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.026552207828215286,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.026552207828215286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.17098445595854922,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.17098445595854922,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128002,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128002\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2036697247706422,\n \"acc_stderr\": 0.0172667420876308,\n \"acc_norm\"\
: 0.2036697247706422,\n \"acc_norm_stderr\": 0.0172667420876308\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.027920963147993666,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.027920963147993666\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29535864978902954,\n \"acc_stderr\": 0.029696338713422882,\n \
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.029696338713422882\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.242152466367713,\n\
\ \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.242152466367713,\n\
\ \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.041032038305145124,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.041032038305145124\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915206,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915206\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.029872577708891155,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.029872577708891155\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.016095302969878565,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.016095302969878565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279336,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279336\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.01728276069516742,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.01728276069516742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.02752963744017492,\n\
\ \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.02752963744017492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080515,\n \"mc2\": 0.38469934472881057,\n\
\ \"mc2_stderr\": 0.014966198091063187\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5540647198105761,\n \"acc_stderr\": 0.01397009348233069\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|arc:challenge|25_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|gsm8k|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hellaswag|10_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T12-28-40.127971.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- '**/details_harness|winogrande|5_2023-12-10T12-28-40.127971.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T12-28-40.127971.parquet'
- config_name: results
data_files:
- split: 2023_12_10T12_28_40.127971
path:
- results_2023-12-10T12-28-40.127971.parquet
- split: latest
path:
- results_2023-12-10T12-28-40.127971.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Falcon-1B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-Falcon-1B](https://huggingface.co/KnutJaegersberg/Walter-Falcon-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B",
"harness_winogrande_5",
split="train")
```
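
The per-task config names follow a simple convention derived from the harness task identifiers used in the results JSON: pipes, colons, and dashes all become underscores (compare `harness|truthfulqa:mc|0` with the `harness_truthfulqa_mc_0` config above). A minimal sketch of that mapping (the helper name `task_to_config` is illustrative, not part of any library):

```python
def task_to_config(task: str) -> str:
    """Map a harness task id (as it appears in the results JSON)
    to the dataset config name used in this repository."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))           # harness_truthfulqa_mc_0
print(task_to_config("harness|hendrycksTest-virology|5"))  # harness_hendrycksTest_virology_5
```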
## Latest results
These are the [latest results from run 2023-12-10T12:28:40.127971](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Falcon-1B/blob/main/results_2023-12-10T12-28-40.127971.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2504304926123702,
"acc_stderr": 0.03055596076992834,
"acc_norm": 0.2520875598433206,
"acc_norm_stderr": 0.031361161079445435,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080515,
"mc2": 0.38469934472881057,
"mc2_stderr": 0.014966198091063187
},
"harness|arc:challenge|25": {
"acc": 0.28498293515358364,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.310580204778157,
"acc_norm_stderr": 0.013522292098053054
},
"harness|hellaswag|10": {
"acc": 0.42381995618402707,
"acc_stderr": 0.0049315259610357536,
"acc_norm": 0.5491933877713603,
"acc_norm_stderr": 0.004965572246803867
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.030167533468632688,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.030167533468632688
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756191,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756191
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660185,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.023904914311782644,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.023904914311782644
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.026552207828215286,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.026552207828215286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.17098445595854922,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.17098445595854922,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128002,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128002
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2036697247706422,
"acc_stderr": 0.0172667420876308,
"acc_norm": 0.2036697247706422,
"acc_norm_stderr": 0.0172667420876308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.027920963147993666,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.027920963147993666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.242152466367713,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.242152466367713,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915206,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915206
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891155,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891155
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878565,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279336,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.01728276069516742,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.01728276069516742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.02752963744017492,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.02752963744017492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080515,
"mc2": 0.38469934472881057,
"mc2_stderr": 0.014966198091063187
},
"harness|winogrande|5": {
"acc": 0.5540647198105761,
"acc_stderr": 0.01397009348233069
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
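As a quick illustration of how the aggregated `"all"` figures relate to the per-task entries, here is a minimal sketch of an unweighted macro-average, using three values copied from the JSON above (the real aggregate spans every evaluated task, so the number below differs from the reported `"all"` accuracy):

```python
# Excerpt of three per-task entries copied from the results JSON above;
# the full file has one such entry per harness task.
per_task = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2518518518518518},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.16447368421052633},
}

# Unweighted macro-average over the selected tasks; the "all" block in the
# results file aggregates over every task in the same spirit.
macro_acc = sum(entry["acc"] for entry in per_task.values()) / len(per_task)
print(round(macro_acc, 4))  # 0.2254
```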
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zatepyakin/cc3m_min256_max512 | ---
license: unknown
---
|
open-llm-leaderboard/details_hfl__chinese-mixtral | ---
pretty_name: Evaluation run of hfl/chinese-mixtral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hfl/chinese-mixtral](https://huggingface.co/hfl/chinese-mixtral) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hfl__chinese-mixtral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T20:55:39.377397](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-mixtral/blob/main/results_2024-02-04T20-55-39.377397.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6923714264378582,\n\
\ \"acc_stderr\": 0.03032741436858707,\n \"acc_norm\": 0.7058378526318035,\n\
\ \"acc_norm_stderr\": 0.031146777637985557,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815505,\n \"mc2\": 0.46858539506441044,\n\
\ \"mc2_stderr\": 0.014457363907207055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6459044368600683,\n \"acc_stderr\": 0.013975454122756567,\n\
\ \"acc_norm\": 0.6757679180887372,\n \"acc_norm_stderr\": 0.01367881039951882\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6534554869547898,\n\
\ \"acc_stderr\": 0.004748965717214273,\n \"acc_norm\": 0.853415654252141,\n\
\ \"acc_norm_stderr\": 0.0035296822858572646\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343603,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343603\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565656,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565656\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.7109826589595376,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.04940635630605659,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.04940635630605659\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424385,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424385\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049395,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049395\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n\
\ \"acc_stderr\": 0.020923327006423294,\n \"acc_norm\": 0.8387096774193549,\n\
\ \"acc_norm_stderr\": 0.020923327006423294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6009852216748769,\n \"acc_stderr\": 0.034454876862647144,\n\
\ \"acc_norm\": 0.6009852216748769,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562094,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562094\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377273,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377273\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7857142857142857,\n \"acc_stderr\": 0.02665353159671549,\n \
\ \"acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.02665353159671549\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8752293577981651,\n \"acc_stderr\": 0.014168298359156327,\n \"\
acc_norm\": 0.8752293577981651,\n \"acc_norm_stderr\": 0.014168298359156327\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595698,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595698\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869622,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869622\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924978,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924978\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8786717752234994,\n\
\ \"acc_stderr\": 0.01167591388390672,\n \"acc_norm\": 0.8786717752234994,\n\
\ \"acc_norm_stderr\": 0.01167591388390672\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\
\ \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n\
\ \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.022589318888176703,\n\
\ \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.022589318888176703\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.02322275679743511,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.02322275679743511\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257145,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257145\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5202086049543677,\n\
\ \"acc_stderr\": 0.012759801427767552,\n \"acc_norm\": 0.5202086049543677,\n\
\ \"acc_norm_stderr\": 0.012759801427767552\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7904411764705882,\n \"acc_stderr\": 0.02472311040767707,\n\
\ \"acc_norm\": 0.7904411764705882,\n \"acc_norm_stderr\": 0.02472311040767707\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7663398692810458,\n \"acc_stderr\": 0.017119158496044506,\n \
\ \"acc_norm\": 0.7663398692810458,\n \"acc_norm_stderr\": 0.017119158496044506\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.026882144922307744,\n\
\ \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.026882144922307744\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815505,\n \"mc2\": 0.46858539506441044,\n\
\ \"mc2_stderr\": 0.014457363907207055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/hfl/chinese-mixtral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|arc:challenge|25_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|arc:challenge|25_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|gsm8k|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|gsm8k|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hellaswag|10_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hellaswag|10_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T07-43-13.375252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-55-39.377397.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T20-55-39.377397.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- '**/details_harness|winogrande|5_2024-02-02T07-43-13.375252.parquet'
- split: 2024_02_04T20_55_39.377397
path:
- '**/details_harness|winogrande|5_2024-02-04T20-55-39.377397.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T20-55-39.377397.parquet'
- config_name: results
data_files:
- split: 2024_02_02T07_43_13.375252
path:
- results_2024-02-02T07-43-13.375252.parquet
- split: 2024_02_04T20_55_39.377397
path:
- results_2024-02-04T20-55-39.377397.parquet
- split: latest
path:
- results_2024-02-04T20-55-39.377397.parquet
---
# Dataset Card for Evaluation run of hfl/chinese-mixtral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hfl/chinese-mixtral](https://huggingface.co/hfl/chinese-mixtral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hfl__chinese-mixtral",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-04T20:55:39.377397](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-mixtral/blob/main/results_2024-02-04T20-55-39.377397.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6923714264378582,
"acc_stderr": 0.03032741436858707,
"acc_norm": 0.7058378526318035,
"acc_norm_stderr": 0.031146777637985557,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815505,
"mc2": 0.46858539506441044,
"mc2_stderr": 0.014457363907207055
},
"harness|arc:challenge|25": {
"acc": 0.6459044368600683,
"acc_stderr": 0.013975454122756567,
"acc_norm": 0.6757679180887372,
"acc_norm_stderr": 0.01367881039951882
},
"harness|hellaswag|10": {
"acc": 0.6534554869547898,
"acc_stderr": 0.004748965717214273,
"acc_norm": 0.853415654252141,
"acc_norm_stderr": 0.0035296822858572646
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343603,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343603
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565656,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565656
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.04940635630605659,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.04940635630605659
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424385,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424385
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049395,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049395
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423294,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6009852216748769,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.6009852216748769,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562094,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562094
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377273,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377273
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8752293577981651,
"acc_stderr": 0.014168298359156327,
"acc_norm": 0.8752293577981651,
"acc_norm_stderr": 0.014168298359156327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595698,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595698
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869622,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869622
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924978,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924978
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8786717752234994,
"acc_stderr": 0.01167591388390672,
"acc_norm": 0.8786717752234994,
"acc_norm_stderr": 0.01167591388390672
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3553072625698324,
"acc_stderr": 0.01600698993480319,
"acc_norm": 0.3553072625698324,
"acc_norm_stderr": 0.01600698993480319
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.022589318888176703,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.022589318888176703
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.02322275679743511,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.02322275679743511
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257145,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257145
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5202086049543677,
"acc_stderr": 0.012759801427767552,
"acc_norm": 0.5202086049543677,
"acc_norm_stderr": 0.012759801427767552
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7904411764705882,
"acc_stderr": 0.02472311040767707,
"acc_norm": 0.7904411764705882,
"acc_norm_stderr": 0.02472311040767707
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7663398692810458,
"acc_stderr": 0.017119158496044506,
"acc_norm": 0.7663398692810458,
"acc_norm_stderr": 0.017119158496044506
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815505,
"mc2": 0.46858539506441044,
"mc2_stderr": 0.014457363907207055
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.010796468688068684
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
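The aggregated `"all"` block above is (approximately) the mean of the per-task scores. As a minimal illustrative sketch (not part of the evaluation harness), the snippet below hand-copies a small subset of the per-task accuracies from the JSON above and averages them the same way:

```python
# Hand-copied excerpt of the per-task results shown above (illustrative only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.39},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6592592592592592},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7763157894736842},
}

# Average the accuracies, mirroring how the "all" aggregate is computed
# over the full task set.
task_accs = [v["acc"] for v in results.values()]
mean_acc = sum(task_accs) / len(task_accs)
print(f"mean accuracy over {len(task_accs)} tasks: {mean_acc:.4f}")
```

With the full 60+ task set, this kind of average yields the `"acc": 0.6923...` reported under `"all"` above.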
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_zarakiquemparte__zarafusionix-l2-7b | ---
pretty_name: Evaluation run of zarakiquemparte/zarafusionix-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zarafusionix-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionix-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zarafusionix-l2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:56:11.100071](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionix-l2-7b/blob/main/results_2023-09-22T19-56-11.100071.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20669043624161074,\n\
\ \"em_stderr\": 0.004146877317311672,\n \"f1\": 0.29368812919463155,\n\
\ \"f1_stderr\": 0.004195906469994281,\n \"acc\": 0.40933494018871774,\n\
\ \"acc_stderr\": 0.009672451208885371\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.20669043624161074,\n \"em_stderr\": 0.004146877317311672,\n\
\ \"f1\": 0.29368812919463155,\n \"f1_stderr\": 0.004195906469994281\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07202426080363912,\n \
\ \"acc_stderr\": 0.007121147983537124\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233618\n\
\ }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zarafusionix-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_56_11.100071
path:
- '**/details_harness|drop|3_2023-09-22T19-56-11.100071.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-56-11.100071.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_56_11.100071
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-56-11.100071.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-56-11.100071.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_56_11.100071
path:
- '**/details_harness|winogrande|5_2023-09-22T19-56-11.100071.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-56-11.100071.parquet'
- config_name: results
data_files:
- split: 2023_09_22T19_56_11.100071
path:
- results_2023-09-22T19-56-11.100071.parquet
- split: latest
path:
- results_2023-09-22T19-56-11.100071.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zarafusionix-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zarafusionix-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zarafusionix-l2-7b](https://huggingface.co/zarakiquemparte/zarafusionix-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zarafusionix-l2-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T19:56:11.100071](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zarafusionix-l2-7b/blob/main/results_2023-09-22T19-56-11.100071.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.20669043624161074,
"em_stderr": 0.004146877317311672,
"f1": 0.29368812919463155,
"f1_stderr": 0.004195906469994281,
"acc": 0.40933494018871774,
"acc_stderr": 0.009672451208885371
},
"harness|drop|3": {
"em": 0.20669043624161074,
"em_stderr": 0.004146877317311672,
"f1": 0.29368812919463155,
"f1_stderr": 0.004195906469994281
},
"harness|gsm8k|5": {
"acc": 0.07202426080363912,
"acc_stderr": 0.007121147983537124
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233618
}
}
```
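As a quick sanity check, the aggregated `acc` under `"all"` appears to be the simple mean of the per-task `acc` values (and likewise for `acc_stderr`). A minimal offline sketch using only the numbers shown above:

```python
# Metrics copied verbatim from the latest-results JSON above.
results = {
    "all": {"acc": 0.40933494018871774, "acc_stderr": 0.009672451208885371},
    "harness|gsm8k|5": {"acc": 0.07202426080363912, "acc_stderr": 0.007121147983537124},
    "harness|winogrande|5": {"acc": 0.7466456195737964, "acc_stderr": 0.012223754434233618},
}

acc_tasks = ["harness|gsm8k|5", "harness|winogrande|5"]
for metric in ("acc", "acc_stderr"):
    mean = sum(results[t][metric] for t in acc_tasks) / len(acc_tasks)
    # The aggregate matches the per-task mean to floating-point precision.
    assert abs(mean - results["all"][metric]) < 1e-12, metric
```

(The `drop` em/f1 metrics are reported separately and are not part of the `acc` aggregate.)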
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jonathanguedes26/yaratche | ---
license: openrail
---
|
portuguese-benchmark-datasets/xpaws_pt | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: int64
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 513914
num_examples: 2000
- name: validation
num_bytes: 512235
num_examples: 2000
download_size: 645673
dataset_size: 1026149
---
# Dataset Card for "xpaws_pt"
This is a Portuguese translation of the [PAWS-X dataset](https://huggingface.co/datasets/paws-x). The translation was performed using the Google Translate API.
This dataset follows the same structure as the original. |
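Since the structure matches PAWS-X, each row pairs two sentences with a binary paraphrase label (per the YAML metadata above: `id`, `sentence1`, `sentence2`, `label`). A minimal sketch of one record — the sentence values below are illustrative placeholders, not real rows from the dataset:

```python
# Schema mirrored from the YAML metadata above; following PAWS-X conventions,
# label 1 marks a paraphrase pair and label 0 a non-paraphrase pair.
example = {
    "id": 0,                                 # int64
    "sentence1": "A casa é muito grande.",   # illustrative placeholder
    "sentence2": "A casa é bem espaçosa.",   # illustrative placeholder
    "label": 1,                              # int64: 1 = paraphrase, 0 = not
}

assert set(example) == {"id", "sentence1", "sentence2", "label"}
assert example["label"] in (0, 1)
```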
livinNector/ta-oscar-tokenizer-clean | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9037331867
num_examples: 556772
download_size: 2891190241
dataset_size: 9037331867
---
# Dataset Card for "ta-oscar-tokenizer-clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
instabn/amelua | ---
license: other
license_name: esvd
license_link: LICENSE
---
|
greathero/evenmorex9-threeclass-newercontrailsvalidationdataset | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 11980603.0
num_examples: 400
download_size: 11803931
dataset_size: 11980603.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alkzar90/product-descriptions | ---
license: mit
---
|
mboth/luftVerteilen-50-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: ScoreGrundfunktion
dtype: float64
- name: ZweiteGrundfunktion
dtype: string
- name: ScoreZweiteGrundfunktion
dtype: float64
- name: label
dtype:
class_label:
names:
'0': Auslass
'1': Raum
'2': VolumenstromreglerAbluft
'3': VolumenstromreglerRaum
'4': VolumenstromreglerZuluft
- name: Score
dtype: float64
splits:
- name: train
num_bytes: 60732.34410511364
num_examples: 237
- name: test
num_bytes: 91259
num_examples: 352
- name: valid
num_bytes: 91259
num_examples: 352
download_size: 99040
dataset_size: 243250.34410511365
---
# Dataset Card for "luftVerteilen-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cvrocha/neblina-audio | ---
license: openrail
---
|
CyberHarem/makomo_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of makomo (Pokémon)
This is the dataset of makomo (Pokémon), containing 109 images and their tags.
The core tags of this character are `glasses, hair_ornament, long_hair, hairclip, breasts, blue_eyes, black_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 109 | 57.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makomo_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 109 | 43.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makomo_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 189 | 75.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makomo_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 109 | 54.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makomo_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 189 | 89.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/makomo_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/makomo_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, labcoat, hair_flower, smile, open_mouth, pokemon_(creature), purple_hair, blush, solo |
| 1 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, blush, labcoat, open_mouth, penis, purple_eyes, solo_focus, heart, nipples, purple_hair, saliva, sex, censored, pussy, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | labcoat | hair_flower | smile | open_mouth | pokemon_(creature) | purple_hair | blush | solo | 1boy | hetero | penis | purple_eyes | solo_focus | heart | nipples | saliva | sex | censored | pussy | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:--------------|:--------|:-------------|:---------------------|:--------------|:--------|:-------|:-------|:---------|:--------|:--------------|:-------------|:--------|:----------|:---------|:------|:-----------|:--------|:----------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nexdata/497_Images_English_Invoice_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
497 images of English invoice data. The images were collected against a solid-color background, and personal information has been desensitized. The set includes various types of invoices and can be used for tasks such as bill recognition and text recognition.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1392?source=Huggingface
## Data size
497 images
## Collecting environment
solid-color background
## Data diversity
multiple types of invoices
## Device
cellphone
## Data format
the image data is in .jpg format
# Licensing Information
Commercial License
|
gmltnwwkd/test5 | ---
dataset_info:
features:
- name: path
dtype: string
- name: sentence
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 1446220905.5255475
num_examples: 287
- name: test
num_bytes: 546941016.4744525
num_examples: 124
download_size: 1911648030
dataset_size: 1993161922.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "test5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
asaxena1990/datasetpreviewcsv | ---
license: cc-by-nc-sa-4.0
---
|
dlibf/ultra_feedback_zephyr-sft | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
splits:
- name: train_prefs
num_bytes: 405688662
num_examples: 61135
- name: test_prefs
num_bytes: 13161585
num_examples: 2000
- name: train_gen
num_bytes: 325040536
num_examples: 61135
- name: test_gen
num_bytes: 5337695
num_examples: 1000
download_size: 420356275
dataset_size: 749228478
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
Amirjalaly/instructs-v.15 | ---
dataset_info:
features:
- name: response
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 121577803
num_examples: 62570
download_size: 50363962
dataset_size: 121577803
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
flaviolima/Flavio | ---
license: openrail
---
|
llm-lens/vocab_tags | ---
dataset_info:
features:
- name: prompt_descriptions
dtype: string
splits:
- name: train
num_bytes: 346971
num_examples: 22131
download_size: 298971
dataset_size: 346971
---
# Dataset Card for "vocab_tags"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MAsad789565/6735673359988736 | ---
dataset_info:
features:
- name: user
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 6838
num_examples: 4
- name: test
num_bytes: 3279
num_examples: 1
download_size: 40316
dataset_size: 10117
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
DataStudio/AudioVietnameseVoice | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: content
dtype: string
splits:
- name: train
num_bytes: 741433517.0
num_examples: 581
download_size: 722755135
dataset_size: 741433517.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DylanJHJ/temp | ---
license: apache-2.0
---
|
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_CM_T_A_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 2026125
num_examples: 1000
download_size: 415327
dataset_size: 2026125
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_CM_T_A_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo16_2_mix_50_kl_0.1_prm_160m_thr_0.0_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43800231
num_examples: 18928
- name: epoch_1
num_bytes: 44360861
num_examples: 18928
- name: epoch_2
num_bytes: 44434638
num_examples: 18928
- name: epoch_3
num_bytes: 44453497
num_examples: 18928
- name: epoch_4
num_bytes: 44453617
num_examples: 18928
- name: epoch_5
num_bytes: 44449467
num_examples: 18928
- name: epoch_6
num_bytes: 44436339
num_examples: 18928
- name: epoch_7
num_bytes: 44425307
num_examples: 18928
- name: epoch_8
num_bytes: 44419489
num_examples: 18928
- name: epoch_9
num_bytes: 44416273
num_examples: 18928
- name: epoch_10
num_bytes: 44413967
num_examples: 18928
- name: epoch_11
num_bytes: 44411729
num_examples: 18928
- name: epoch_12
num_bytes: 44409934
num_examples: 18928
- name: epoch_13
num_bytes: 44409443
num_examples: 18928
- name: epoch_14
num_bytes: 44407291
num_examples: 18928
- name: epoch_15
num_bytes: 44406704
num_examples: 18928
- name: epoch_16
num_bytes: 44406563
num_examples: 18928
- name: epoch_17
num_bytes: 44405602
num_examples: 18928
- name: epoch_18
num_bytes: 44407357
num_examples: 18928
- name: epoch_19
num_bytes: 44405259
num_examples: 18928
- name: epoch_20
num_bytes: 44407013
num_examples: 18928
- name: epoch_21
num_bytes: 44407420
num_examples: 18928
- name: epoch_22
num_bytes: 44406446
num_examples: 18928
- name: epoch_23
num_bytes: 44406607
num_examples: 18928
- name: epoch_24
num_bytes: 44404891
num_examples: 18928
- name: epoch_25
num_bytes: 44405934
num_examples: 18928
- name: epoch_26
num_bytes: 44407169
num_examples: 18928
- name: epoch_27
num_bytes: 44404320
num_examples: 18928
- name: epoch_28
num_bytes: 44406013
num_examples: 18928
- name: epoch_29
num_bytes: 44404414
num_examples: 18928
download_size: 701318354
dataset_size: 1331793795
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
arubenruben/segundo_harem_conll_2003_style | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 1047476
num_examples: 93
- name: validation
num_bytes: 249755
num_examples: 23
download_size: 295815
dataset_size: 1297231
---
# Dataset Card for "segundo_harem_conll_2003_style"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/leizi_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of leizi/レイズ/惊蛰 (Arknights)
This is the dataset of leizi/レイズ/惊蛰 (Arknights), containing 125 images and their tags.
The core tags of this character are `long_hair, blonde_hair, horns, pointy_ears, breasts, blue_eyes, purple_eyes, hair_ornament, hair_between_eyes, very_long_hair, ahoge`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 125 | 218.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leizi_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 125 | 182.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leizi_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 311 | 360.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leizi_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leizi_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, bare_shoulders, solo, black_dress, black_gloves, fingerless_gloves, looking_at_viewer, open_clothes, simple_background, long_sleeves, off_shoulder, sleeveless_dress, black_pantyhose, id_card, white_background, holding_staff, tail, white_jacket, cowboy_shot, hairclip, medium_breasts |
| 1 | 7 |  |  |  |  |  | 1girl, solo, upper_body, bare_shoulders, black_gloves, looking_at_viewer, simple_background, sleeveless, white_background, fingerless_gloves, blush, double_bun, black_shirt, closed_mouth, hairclip, infection_monitor_(arknights), open_mouth |
| 2 | 9 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, sleeveless, solo, upper_body, id_card, jacket, off_shoulder, black_gloves, double_bun, medium_breasts, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | black_dress | black_gloves | fingerless_gloves | looking_at_viewer | open_clothes | simple_background | long_sleeves | off_shoulder | sleeveless_dress | black_pantyhose | id_card | white_background | holding_staff | tail | white_jacket | cowboy_shot | hairclip | medium_breasts | upper_body | sleeveless | blush | double_bun | black_shirt | closed_mouth | infection_monitor_(arknights) | open_mouth | jacket | shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------|:---------------|:--------------------|:--------------------|:---------------|:--------------------|:---------------|:---------------|:-------------------|:------------------|:----------|:-------------------|:----------------|:-------|:---------------|:--------------|:-----------|:-----------------|:-------------|:-------------|:--------|:-------------|:--------------|:---------------|:--------------------------------|:-------------|:---------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | X | X | | X | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | | X | | | | X | | | X | | | | | | | X | X | X | | X | | | | | X | X |
|
atgarcia/valDataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: emg
sequence:
sequence: float64
splits:
- name: train
num_bytes: 754328716
num_examples: 547
download_size: 280899805
dataset_size: 754328716
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_1713009506 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3224063
num_examples: 7908
download_size: 1598608
dataset_size: 3224063
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/mogami_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mogami/最上/最上 (Azur Lane)
This is the dataset of mogami/最上/最上 (Azur Lane), containing 19 images and their tags.
The core tags of this character are `brown_hair, horns, single_horn, pointy_ears, breasts, red_eyes, long_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 18.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 13.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 23.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 17.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 28.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mogami_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mogami_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, detached_sleeves, thighhighs, white_background, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | detached_sleeves | thighhighs | white_background | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-------------------|:-------------|:-------------------|:---------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X |
|
hon9kon9ize/38k-zh-yue-translation-llm-generated | ---
license: cc-by-nc-sa-4.0
dataset_info:
features:
- name: zh
dtype: string
- name: yue
dtype: string
splits:
- name: train
num_bytes: 6642874
num_examples: 38142
- name: test
num_bytes: 2210155
num_examples: 12170
download_size: 5922293
dataset_size: 8853029
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
This dataset consists of Chinese (Simplified) to Cantonese translation pairs generated using large language models (LLMs) and translated by Google PaLM 2. The dataset aims to provide a collection of translated sentences for training and evaluating Chinese (Simplified) to Cantonese translation models.
The dataset creation process involved two main steps:
1. LLM sentence generation: ChatGPT, a powerful LLM, was used to generate 10 sentences for each term pair. These sentences were generated in Chinese (Simplified) and were designed to cover diverse contexts and language patterns.
2. Translation with Google PaLM 2: the Chinese (Simplified) sentences generated by ChatGPT were then translated into Cantonese using the Google PaLM 2 translation model. This step ensured the creation of accurate translation pairs for the dataset.
For more details, please visit our [blog post](https://hon9kon9ize.com/posts/2023-12-11-low-resource-language)
## Limitations and Usage
It is important to note the following limitations and considerations regarding this dataset:
- Limited contextual understanding: as the dataset is generated using language models such as ChatGPT, it may have limited contextual understanding. The generated sentences may not always capture nuanced meanings or specific domain knowledge accurately.
- Automated translation: the translation process was performed using the Google PaLM 2 translation model. While efforts were made to ensure accurate translations, there may still be instances where the translations are not entirely precise or do not reflect certain regional variations.
- Lack of manual proofreading: the dataset has not undergone manual proofreading or human validation. As a result, some translations may contain errors, inconsistencies, or inappropriate or harmful words generated by the LLMs.
Users of this dataset should exercise caution and implement appropriate filtering or post-processing techniques to address any potential issues related to accuracy, appropriateness, or harmful language.
|
nnngoc/data_test_2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 17851
num_examples: 38
download_size: 9964
dataset_size: 17851
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sinias/DominicCritelli | ---
license: afl-3.0
---
|
panopstor/nvflickritw-cogvlm-captions | ---
license: cc0-1.0
---
This dataset contains captions only for 45k images from the Nvidia Flickr "In the wild" dataset (https://github.com/NVlabs/ffhq-dataset).
The captions are provided here under the CC0 license, as I believe the outputs of the captioning models used do not fall under those models' licenses.
Check the Nvidia Flickr dataset URL for information on use restrictions and copyright for the images in the dataset itself.
Captions are .txt files with the same basename as the associated image, created using the CogVLM chat model (https://huggingface.co/THUDM/cogvl). CogVLM captioning ran on an RTX 6000 Ada and took a few days, as each caption takes 5-8 seconds.
Script to run: `https://github.com/victorchall/EveryDream2trainer/blob/main/caption_cog.py`
Command used:
```python caption_cog.py --image_dir /mnt/q/mldata/nvidia-flickr-itw --num_beams 3 --top_k 45 --top_p 0.9 --temp 0.95 --prompt "Write a concise, accurate, blunt, and detailed description. Avoid euphemisms, vague wording, or ambiguous expressions. Do not exceed 21 words." ```
Captions from BLIP-1 beam, BLIP-1 nucleus, and BLIP-2 6.7b (default) are also provided. See https://github.com/salesforce/LAVIS for information on BLIP and BLIP-2.
The BLIP 1/2 captions were run quite a while ago, and to be honest I don't recall the full details.
Raw .txt files are provided in zip files chunked by 1000 images each, for use with img/txt pair file-based dataloaders or for packing into a webdataset tar. These correspond to the original dataset, which is provided as images only, named `[00000..44999].png`.
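As a sketch of consuming the img/txt pairs once a chunk is extracted (the function name `load_caption_pairs` is my own, not part of the dataset):

```python
from pathlib import Path

def load_caption_pairs(root):
    """Pair each .png image with the caption .txt sharing its basename."""
    root = Path(root)
    pairs = {}
    for img in sorted(root.glob("*.png")):
        txt = img.with_suffix(".txt")  # e.g. 00042.png -> 00042.txt
        if txt.exists():
            pairs[img] = txt.read_text(encoding="utf-8").strip()
    return pairs
```

Images without a matching caption file are simply skipped.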
The Parquet file should be self-explanatory from there; integrate or transform it as needed. |
Limour/b-corpus | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- zh
tags:
- not-for-all-audiences
---
A Chinese long-text corpus, painstakingly cleaned and chopped into fine pieces entirely by hand and eye.
Download: `$env:HF_ENDPOINT="https://hf-mirror.com"; python -c "from huggingface_hub import snapshot_download; snapshot_download(repo_id='Limour/b-corpus', repo_type='dataset', local_dir=r'D:\datasets\tmp')"`
1. Cleaning requirements: `full-width to half-width` | `traditional to simplified`
2. Internal deduplication: `s/(.)\1{3,}/$1$1$1/g` | `s/(.{2,}?)\1{2,}/$1/g` | `s/(((^.*$)[\r\n]*){1,10}?)\1{1,}/$1/g`
3. Miscellaneous: `s/^([\x00-\x3e\x40-\xff]{1,4})[\x00-\xff]*:/$1:/g`
4. Typo fixes: `s/巴巴/爸爸/g` | `s/阿阿+/啊啊/g` | `s/很抄/很吵/g` | `s/能苟/能够/g`
5. Typo fixes: `s/拉\b/啦/g` | `s/巴\b/吧/g` | `s/阿\b/啊/g`
6. Each complete dialogue is one file
7. Each line has the format `{NAME}:{DIALOGUE}` (the ':' is the full-width Chinese colon)
8. The {NAME} for narration is `旁白` (narrator)
9. The {NAME} for unknown characters is `?`
10. The protagonist's {NAME} is `我/名字` (I/name) when it can be inferred from the narration, otherwise just `名字` (name)
11. For works such as 万华镜 where the protagonist's name changes, the `名字` part changes while the `我/` prefix stays fixed
12. Corpora outside `b-corpus\视觉小说\format` serve to increase diversity
13. Explicit content is fully preserved; some content involves *problematic worldviews and moral values*
14. Note: in some multi-perspective corpora, the protagonist may change as the narration changes
15. `b-corpus\v-corpus-en` comes from [alpindale](https://huggingface.co/alpindale)/[visual-novels](https://huggingface.co/datasets/alpindale/visual-novels); perhaps it could be translated into Chinese?
16. The data has been organized by `制作会社\作品名` (studio\title), some minor errors were fixed, and the result is saved under the `v-corpus-zh` directory
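The first two internal-deduplication rules above are Perl-style substitutions; here is a sketch of how they behave, rendered with Python's stdlib `re` (the third, multi-line rule is omitted, and `dedup` is my own name for the combined pass):

```python
import re

def dedup(text):
    # s/(.)\1{3,}/$1$1$1/g: collapse any character repeated 4+ times down to 3
    text = re.sub(r'(.)\1{3,}', r'\1\1\1', text)
    # s/(.{2,}?)\1{2,}/$1/g: collapse a unit of 2+ chars repeated 3+ times to one copy
    text = re.sub(r'(.{2,}?)\1{2,}', r'\1', text)
    return text
```

For example, `dedup('啊啊啊啊啊')` yields `'啊啊啊'` and `dedup('hahaha')` yields `'ha'`.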
```python
from opencc import OpenCC  # third-party package for Chinese script conversion
import unicodedata

cc = OpenCC('t2s')  # 't2s' means traditional-to-simplified conversion

def fullwidth_to_halfwidth(input_str):
    return ''.join(unicodedata.normalize('NFKC', char) for char in input_str)

def clearT(s):
    s = cc.convert(fullwidth_to_halfwidth(s))
    return s.strip()
```
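As a quick sanity check of the full-width conversion step alone (the `opencc` dependency is third-party, so it is left out of this sketch):

```python
import unicodedata

def fullwidth_to_halfwidth(input_str):
    # NFKC compatibility normalization maps full-width forms (U+FF01..U+FF5E) to ASCII
    return ''.join(unicodedata.normalize('NFKC', char) for char in input_str)

print(fullwidth_to_halfwidth('ABC:123'))  # -> ABC:123
```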
+ Example of problematic values (corpus excerpt, kept in the original Chinese)
```txt
旁白:她抵达了终之空。
旁白:她已经超越了万物....
旁白:超越万物....
旁白:也就是,
旁白:抵达极致....
女信徒A:野崎跳下去了!
女信徒A:真棒。我们跟上吧!趁俗世的权利还没阻止我们抵达极致!!
女信徒A:救世主大人
女信徒A:我先走一步
我/卓司:嗯
女性信者B:我也走了
女信徒C:我也....各位,保重
男信徒A:我也要....跟这不完美的世界说再见了。各位,再见....
由香:在完美的世界里,我们也要在一起,
?:嗯——
由香:再见,救世主大人。非常感谢
我/卓司:嗯....
男信徒B:这样就能跟这个世界说再见了....我讨厌这个世界
男信徒B:以后就能在完美的世界里——
男信徒B:在完美的世界里,过上幸福的日子!!
男信徒B:没有家人
男信徒B:没有老师
男信徒B:没有考试
男信徒B:也没有学校
男信徒C:完美的世界,我来了!!
旁白:....信徒们一个个地抵达终之空....
```
```txt
我/卓司:真的会有祝福自己诞生的人吗?
我/卓司:正因为诅咒一切,在这个世界诞生,因为一切都是谬误,我们才——
行人:是啊,没错,是这样。如果诞生是惩罚的话,我们在诞生的瞬间就是丧家犬了....
我/卓司:那为什么——
行人:所以就要勒紧刚诞生的婴儿的脖子?
旁白:我有些惊讶。
旁白:“掐住新生婴儿的脖子,将其人生在10分钟结束,谁也不会有异议”
旁白:虽然这的确是我说的,但那是在方舟的演说上说的。我不认为行人会知道。
旁白:这样的话。
我/卓司:我们在思考同样的事?
行人:谁知道呢?但我能断言,你的所作所为是错误的
旁白:即使如此,也要断定我是错的。
旁白:虽然很想听听他的理由,但若他是最后一人,也就没必要听了。
旁白:因为这是我在思考的事,而水上行人与我的判断完全相反——
我/卓司:被诅咒的,生
我/卓司:被祝福的,生
我/卓司:亦或者是——
我/卓司:被当做诅咒的,死
我/卓司:把这样的死,当做祝福接受
我/卓司:到这里为止,我和你的想法应该是一样的吧?
```
+ Example of explicit content (corpus excerpt, kept in the original Chinese)
```txt
我/块斗:「哪个才是本体啊!?」
月咏:「都是本体哦。这是将自身的存在多次复制的结果」
月咏:「现技术已经支持了,只要大脑适应,就能复制多个自己,并随意行动」
月咏:「就好比自己的神经伸展到外部了的感觉」
我/块斗:「嚯嚯......」
旁白:我也能做到这样的吗......?
月咏:「我一开始就是完全适应的状态......所以这种事也能做到」
旁白:齐刷刷走来的月咏大军将我围住,并把我推倒。
我/块斗:「哇......怎么了?」
月咏:「......在这里做爱的话,就不会对身体有负担了」
月咏:「但是,这样会将大脑的感受更直接的引导出来」
我/块斗:「......也就是说?」
月咏:「......非常抱歉,主人」
月咏:「我......大概是个相当色情的女孩子」
我/块斗:「放心吧。我对你的感情绝对不比你差」
月咏:「好开心......请让我好好侍奉您一番吧」
旁白:两侧的月咏靠近过来,并将她们的嘴唇贴了上来。
月咏1:「嗯......呼......啾......今天,请尽情享受吧......啾......」
旁白:站在身旁的月咏吸住我的嘴唇。
旁白:而且还抓住我的手往自己胸上压。
月咏1:「哈噗......啾、嗯嗯......啊哈......主人,你知道,啾......我现在心脏砰砰跳个不停吗......?」
月咏2:「要这样说,我的心脏现在也......扑通扑通地跳......主人......嗯嗯、请你确认一下......」
旁白:位于另一侧的月咏则是把我的手放在私处。
旁白:指尖有种湿润的触感。
月咏2:「啊......嗯、嗯......主人的手指......啊、啊......我的阴道里面,躁动不停......嗯嗯......好舒服......」
月咏1:「也请尽情摸我的胸部......嗯、啾、嗯溜噜......溜噜、啾......!」
旁白:像是为了对抗让我触摸秘部的月咏,另一个月咏吸住了我的舌头。
我/块斗:「啾......哈啊、哈啊......呜喔喔!?」
月咏3:「呼......嗯、嗯......呼呼、主人,舒服吗......?」
旁白:往身下看去,发现第三个月咏用胸部夹住了我的股间。
```
## Awaiting Fan Translation into Chinese
+ [樱之刻](https://github.com/kono-dada/Sakuranotoki-Chinese) |
niv-al/instruct_sq_600k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 544244075
num_examples: 586394
download_size: 84724923
dataset_size: 544244075
license: openrail
task_categories:
- question-answering
- table-question-answering
- summarization
- text2text-generation
language:
- sq
size_categories:
- 100K<n<1M
---
# Dataset Card for "instruct_sq_600k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nadav/pixel_glue_wnli_noisy_ocr | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 585329
num_examples: 3175
- name: validation
num_bytes: 14140
num_examples: 71
download_size: 328593
dataset_size: 599469
---
# Dataset Card for "pixel_glue_wnli_noisy_ocr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina | ---
pretty_name: Evaluation run of LLMs/AlpacaGPT4-7B-elina
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LLMs/AlpacaGPT4-7B-elina](https://huggingface.co/LLMs/AlpacaGPT4-7B-elina) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T04:06:06.586475](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina/blob/main/results_2023-10-15T04-06-06.586475.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003460570469798658,\n\
\ \"em_stderr\": 0.0006013962884271144,\n \"f1\": 0.06020763422818805,\n\
\ \"f1_stderr\": 0.001415436583944496,\n \"acc\": 0.38620148841562185,\n\
\ \"acc_stderr\": 0.009130838881295832\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003460570469798658,\n \"em_stderr\": 0.0006013962884271144,\n\
\ \"f1\": 0.06020763422818805,\n \"f1_stderr\": 0.001415436583944496\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.045489006823351025,\n \
\ \"acc_stderr\": 0.005739657656722211\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869454\n\
\ }\n}\n```"
repo_url: https://huggingface.co/LLMs/AlpacaGPT4-7B-elina
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T04_06_06.586475
path:
- '**/details_harness|drop|3_2023-10-15T04-06-06.586475.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T04-06-06.586475.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T04_06_06.586475
path:
- '**/details_harness|gsm8k|5_2023-10-15T04-06-06.586475.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T04-06-06.586475.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:21:37.483871.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T12:21:37.483871.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T04_06_06.586475
path:
- '**/details_harness|winogrande|5_2023-10-15T04-06-06.586475.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T04-06-06.586475.parquet'
- config_name: results
data_files:
- split: 2023_07_18T12_21_37.483871
path:
- results_2023-07-18T12:21:37.483871.parquet
- split: 2023_10_15T04_06_06.586475
path:
- results_2023-10-15T04-06-06.586475.parquet
- split: latest
path:
- results_2023-10-15T04-06-06.586475.parquet
---
# Dataset Card for Evaluation run of LLMs/AlpacaGPT4-7B-elina
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/LLMs/AlpacaGPT4-7B-elina
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [LLMs/AlpacaGPT4-7B-elina](https://huggingface.co/LLMs/AlpacaGPT4-7B-elina) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
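Since every timestamped split name encodes the run time with underscores in place of the usual `-` and `:` separators, it can be mapped back to a Python `datetime`. A minimal sketch (the helper name `split_name_to_datetime` is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_10_15T04_06_06.586475":
    # underscores replace "-" in the date part and ":" in the time part.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

latest_run = split_name_to_datetime("2023_10_15T04_06_06.586475")
# latest_run == datetime(2023, 10, 15, 4, 6, 6, 586475)
```

This makes it easy to sort the splits of a configuration chronologically and pick the most recent run by hand, rather than relying on the "latest" alias.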
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T04:06:06.586475](https://huggingface.co/datasets/open-llm-leaderboard/details_LLMs__AlpacaGPT4-7B-elina/blob/main/results_2023-10-15T04-06-06.586475.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003460570469798658,
"em_stderr": 0.0006013962884271144,
"f1": 0.06020763422818805,
"f1_stderr": 0.001415436583944496,
"acc": 0.38620148841562185,
"acc_stderr": 0.009130838881295832
},
"harness|drop|3": {
"em": 0.003460570469798658,
"em_stderr": 0.0006013962884271144,
"f1": 0.06020763422818805,
"f1_stderr": 0.001415436583944496
},
"harness|gsm8k|5": {
"acc": 0.045489006823351025,
"acc_stderr": 0.005739657656722211
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869454
}
}
```
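The results mapping above nests metrics under per-task keys. For quick inspection or export, it can be flattened into `(task, metric, value)` rows; a minimal sketch (`flatten_results` is an illustrative helper, not part of any library):

```python
# Flatten a nested results mapping (task -> metric -> value)
# into a list of (task, metric, value) rows.
def flatten_results(results: dict) -> list:
    rows = []
    for task, metrics in results.items():
        for metric, value in metrics.items():
            rows.append((task, metric, value))
    return rows

rows = flatten_results({
    "harness|gsm8k|5": {
        "acc": 0.045489006823351025,
        "acc_stderr": 0.005739657656722211,
    },
})
# rows[0] == ("harness|gsm8k|5", "acc", 0.045489006823351025)
```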
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tbboukhari/Alpaca-in-french | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: instruction
dtype: string
- name: ' saisir'
dtype: string
- name: ' sortir'
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 23689208
num_examples: 52002
download_size: 14446335
dataset_size: 23689208
---
# Dataset Card for "Alpaca-in-french"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arbml/dummy | ---
dataset_info:
features:
- name: name
dtype: string
- name: age
dtype: string
- name: label
dtype:
class_label:
names:
'0': female
'1': male
splits:
- name: train
num_bytes: 50
num_examples: 2
download_size: 1182
dataset_size: 50
---
# Dataset Card for "dummy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B | ---
pretty_name: Evaluation run of cloudyu/Yi-34Bx2-MoE-60B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/Yi-34Bx2-MoE-60B](https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T00:14:54.121598](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B/blob/main/results_2024-01-11T00-14-54.121598.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7719265002005771,\n\
\ \"acc_stderr\": 0.027890629800356333,\n \"acc_norm\": 0.7749305083860206,\n\
\ \"acc_norm_stderr\": 0.0284361463203916,\n \"mc1\": 0.49326805385556916,\n\
\ \"mc1_stderr\": 0.01750191449265539,\n \"mc2\": 0.6619082030385652,\n\
\ \"mc2_stderr\": 0.014547333891309428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.01371584794071934,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6537542322246565,\n\
\ \"acc_stderr\": 0.00474800327646621,\n \"acc_norm\": 0.852320254929297,\n\
\ \"acc_norm_stderr\": 0.0035405716545956313\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7333333333333333,\n\
\ \"acc_stderr\": 0.038201699145179055,\n \"acc_norm\": 0.7333333333333333,\n\
\ \"acc_norm_stderr\": 0.038201699145179055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n\
\ \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \
\ \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n\
\ \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7655172413793103,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.7655172413793103,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7275132275132276,\n \"acc_stderr\": 0.022930973071633363,\n \"\
acc_norm\": 0.7275132275132276,\n \"acc_norm_stderr\": 0.022930973071633363\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5873015873015873,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.5873015873015873,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"\
acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998573,\n \"\
acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998573\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"\
acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.019348070174396995,\n \
\ \"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.019348070174396995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.44814814814814813,\n \"acc_stderr\": 0.030321167196316286,\n \
\ \"acc_norm\": 0.44814814814814813,\n \"acc_norm_stderr\": 0.030321167196316286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334877,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334877\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176851,\n \"\
acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176851\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658935,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658935\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065522,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065522\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9083969465648855,\n \"acc_stderr\": 0.025300035578642962,\n\
\ \"acc_norm\": 0.9083969465648855,\n \"acc_norm_stderr\": 0.025300035578642962\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"\
acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.03247224389917947,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.03247224389917947\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8834355828220859,\n \"acc_stderr\": 0.025212327210507104,\n\
\ \"acc_norm\": 0.8834355828220859,\n \"acc_norm_stderr\": 0.025212327210507104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\
\ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9080459770114943,\n\
\ \"acc_stderr\": 0.010333225570778521,\n \"acc_norm\": 0.9080459770114943,\n\
\ \"acc_norm_stderr\": 0.010333225570778521\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135026,\n\
\ \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135026\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8100558659217877,\n\
\ \"acc_stderr\": 0.01311902831049268,\n \"acc_norm\": 0.8100558659217877,\n\
\ \"acc_norm_stderr\": 0.01311902831049268\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n\
\ \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n\
\ \"acc_stderr\": 0.021514051585970403,\n \"acc_norm\": 0.8263665594855305,\n\
\ \"acc_norm_stderr\": 0.021514051585970403\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.01830386880689179,\n\
\ \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.01830386880689179\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6418439716312057,\n \"acc_stderr\": 0.028602085862759422,\n \
\ \"acc_norm\": 0.6418439716312057,\n \"acc_norm_stderr\": 0.028602085862759422\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6003911342894394,\n\
\ \"acc_stderr\": 0.012510181636960679,\n \"acc_norm\": 0.6003911342894394,\n\
\ \"acc_norm_stderr\": 0.012510181636960679\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02315746830855936,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02315746830855936\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262554,\n \
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262554\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.02292300409473685,\n\
\ \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.02292300409473685\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072867,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072867\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49326805385556916,\n\
\ \"mc1_stderr\": 0.01750191449265539,\n \"mc2\": 0.6619082030385652,\n\
\ \"mc2_stderr\": 0.014547333891309428\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571748\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.755117513267627,\n \
\ \"acc_stderr\": 0.011844819027863673\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|arc:challenge|25_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|gsm8k|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hellaswag|10_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T00-14-54.121598.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- '**/details_harness|winogrande|5_2024-01-11T00-14-54.121598.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T00-14-54.121598.parquet'
- config_name: results
data_files:
- split: 2024_01_11T00_14_54.121598
path:
- results_2024-01-11T00-14-54.121598.parquet
- split: latest
path:
- results_2024-01-11T00-14-54.121598.parquet
---
# Dataset Card for Evaluation run of cloudyu/Yi-34Bx2-MoE-60B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Yi-34Bx2-MoE-60B](https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration; the split is named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-11T00:14:54.121598](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Yi-34Bx2-MoE-60B/blob/main/results_2024-01-11T00-14-54.121598.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7719265002005771,
"acc_stderr": 0.027890629800356333,
"acc_norm": 0.7749305083860206,
"acc_norm_stderr": 0.0284361463203916,
"mc1": 0.49326805385556916,
"mc1_stderr": 0.01750191449265539,
"mc2": 0.6619082030385652,
"mc2_stderr": 0.014547333891309428
},
"harness|arc:challenge|25": {
"acc": 0.6723549488054608,
"acc_stderr": 0.01371584794071934,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.6537542322246565,
"acc_stderr": 0.00474800327646621,
"acc_norm": 0.852320254929297,
"acc_norm_stderr": 0.0035405716545956313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.038201699145179055,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.038201699145179055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866514,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866514
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7655172413793103,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.7655172413793103,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7275132275132276,
"acc_stderr": 0.022930973071633363,
"acc_norm": 0.7275132275132276,
"acc_norm_stderr": 0.022930973071633363
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5873015873015873,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.5873015873015873,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.03395970381998573,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.03395970381998573
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.019348070174396995,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.019348070174396995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.44814814814814813,
"acc_stderr": 0.030321167196316286,
"acc_norm": 0.44814814814814813,
"acc_norm_stderr": 0.030321167196316286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334877,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334877
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176851,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176851
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658935,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065522,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065522
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9083969465648855,
"acc_stderr": 0.025300035578642962,
"acc_norm": 0.9083969465648855,
"acc_norm_stderr": 0.025300035578642962
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917947,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917947
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8834355828220859,
"acc_stderr": 0.025212327210507104,
"acc_norm": 0.8834355828220859,
"acc_norm_stderr": 0.025212327210507104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778521,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778521
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135026,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8100558659217877,
"acc_stderr": 0.01311902831049268,
"acc_norm": 0.8100558659217877,
"acc_norm_stderr": 0.01311902831049268
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.019899435463539946,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.019899435463539946
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.021514051585970403,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.021514051585970403
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.01830386880689179,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.01830386880689179
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6418439716312057,
"acc_stderr": 0.028602085862759422,
"acc_norm": 0.6418439716312057,
"acc_norm_stderr": 0.028602085862759422
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6003911342894394,
"acc_stderr": 0.012510181636960679,
"acc_norm": 0.6003911342894394,
"acc_norm_stderr": 0.012510181636960679
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02315746830855936,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02315746830855936
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262554,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.02292300409473685,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.02292300409473685
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072867,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072867
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49326805385556916,
"mc1_stderr": 0.01750191449265539,
"mc2": 0.6619082030385652,
"mc2_stderr": 0.014547333891309428
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571748
},
"harness|gsm8k|5": {
"acc": 0.755117513267627,
"acc_stderr": 0.011844819027863673
}
}
```
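The per-task entries above can be post-processed directly. A minimal sketch of averaging `acc` across tasks, using a small excerpt of the scores reported above (not the full 63 tasks, so the result will not match the aggregated `"all"` value):

```python
# Excerpt of per-task scores copied from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.6723549488054608},
    "harness|hellaswag|10": {"acc": 0.6537542322246565},
    "harness|winogrande|5": {"acc": 0.8484609313338595},
}

# Mean accuracy over the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(f"mean acc over {len(results)} tasks: {mean_acc:.4f}")
```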
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
merve/ner-flags | ---
license: apache-2.0
---
|
Jing24/generate_sub_1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 63954468
num_examples: 70370
download_size: 11445492
dataset_size: 63954468
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generate_sub_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cats_vs_dogs | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
paperswithcode_id: cats-vs-dogs
pretty_name: Cats Vs. Dogs
dataset_info:
features:
- name: image
dtype: image
- name: labels
dtype:
class_label:
names:
'0': cat
'1': dog
splits:
- name: train
num_bytes: 3844792
num_examples: 23410
download_size: 824887076
dataset_size: 3844792
---
# Dataset Card for Cats Vs. Dogs
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Cats vs Dogs Dataset](https://www.microsoft.com/en-us/download/details.aspx?id=54765)
- **Repository:**
- **Paper:** [Asirra: A CAPTCHA that Exploits Interest-Aligned Manual Image Categorization](https://www.microsoft.com/en-us/research/wp-content/uploads/2007/10/CCS2007.pdf)
- **Leaderboard:** [Dogs vs. Cats](https://www.kaggle.com/competitions/dogs-vs-cats)
- **Point of Contact:**
### Dataset Summary
A large set of images of cats and dogs, from which 1,738 corrupted images have been dropped. This dataset is part of a now-closed Kaggle competition and represents a subset of the so-called Asirra dataset.
From the competition page:
> The Asirra data set
>
> Web services are often protected with a challenge that's supposed to be easy for people to solve, but difficult for computers. Such a challenge is often called a [CAPTCHA](http://www.captcha.net/) (Completely Automated Public Turing test to tell Computers and Humans Apart) or HIP (Human Interactive Proof). HIPs are used for many purposes, such as to reduce email and blog spam and prevent brute-force attacks on web site passwords.
>
> Asirra (Animal Species Image Recognition for Restricting Access) is a HIP that works by asking users to identify photographs of cats and dogs. This task is difficult for computers, but studies have shown that people can accomplish it quickly and accurately. Many even think it's fun! Here is an example of the Asirra interface:
>
> Asirra is unique because of its partnership with [Petfinder.com](https://www.petfinder.com/), the world's largest site devoted to finding homes for homeless pets. They've provided Microsoft Research with over three million images of cats and dogs, manually classified by people at thousands of animal shelters across the United States. Kaggle is fortunate to offer a subset of this data for fun and research.
### Supported Tasks and Leaderboards
- `image-classification`: The goal of this task is to classify a given image as either containing a cat or a dog. The leaderboard is available [here](https://paperswithcode.com/sota/image-classification-on-cats-vs-dogs).
### Languages
English.
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=500x375 at 0x29CEAD71780>,
'labels': 0
}
```
### Data Fields
The data instances have the following fields:
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `labels`: an `int` classification label.
Class Label Mappings:
```
{
"cat": 0,
"dog": 1,
}
```
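The label mapping above can be applied in code. A minimal sketch (plain Python, no dataset download required) of converting between class names and integer labels:

```python
# Class label mapping from the card above.
label2id = {"cat": 0, "dog": 1}
id2label = {v: k for k, v in label2id.items()}

def decode_label(label: int) -> str:
    """Map an integer label back to its class name."""
    return id2label[label]

print(decode_label(0))  # cat
```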
### Data Splits
| | train |
|---------------|------:|
| # of examples | 23410 |
## Dataset Creation
### Curation Rationale
This subset was built to test whether computer vision algorithms can beat the Asirra CAPTCHA.
From the competition page:
> Image recognition attacks
>
> While random guessing is the easiest form of attack, various forms of image recognition can allow an attacker to make guesses that are better than random. There is enormous diversity in the photo database (a wide variety of backgrounds, angles, poses, lighting, etc.), making accurate automatic classification difficult. In an informal poll conducted many years ago, computer vision experts posited that a classifier with better than 60% accuracy would be difficult without a major advance in the state of the art. For reference, a 60% classifier improves the guessing probability of a 12-image HIP from 1/4096 to 1/459.
### Source Data
#### Initial Data Collection and Normalization
This dataset is a subset of the Asirra dataset.
From the competition page:
> Asirra is unique because of its partnership with Petfinder.com, the world's largest site devoted to finding homes for homeless pets. They've provided Microsoft Research with over three million images of cats and dogs, manually classified by people at thousands of animal shelters across the United States.
#### Who are the source language producers?
The users of [Petfinder.com](https://www.petfinder.com/).
### Annotations
#### Annotation process
The images were annotated by selecting a pet category on [Petfinder.com](https://www.petfinder.com/).
#### Who are the annotators?
The users of [Petfinder.com](https://www.petfinder.com/).
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
From the paper:
> Unlike many image-based CAPTCHAs which are abstract or subjective, Asirra’s challenges are concrete, inoffensive (cute, by some accounts), require no specialized or culturally biased knowledge, and have definite ground truth. This makes Asirra less frustrating for humans. Some beta-testers found it fun. The four-year-old child of one asked several times to “play the cat and dog game again.”
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@inproceedings{asirra-a-captcha-that-exploits-interest-aligned-manual-image-categorization,
author = {Elson, Jeremy and Douceur, John (JD) and Howell, Jon and Saul, Jared},
title = {Asirra: A CAPTCHA that Exploits Interest-Aligned Manual Image Categorization},
booktitle = {Proceedings of 14th ACM Conference on Computer and Communications Security (CCS)},
year = {2007},
month = {October},
publisher = {Association for Computing Machinery, Inc.},
url = {https://www.microsoft.com/en-us/research/publication/asirra-a-captcha-that-exploits-interest-aligned-manual-image-categorization/},
edition = {Proceedings of 14th ACM Conference on Computer and Communications Security (CCS)},
}
```
### Contributions
Thanks to [@nateraw](https://github.com/nateraw) for adding this dataset. |
CyberHarem/katarina_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of katarina (League of Legends)
This is the dataset of katarina (League of Legends), containing 500 images and their tags.
The core tags of this character are `long_hair, red_hair, breasts, green_eyes, large_breasts, scar_across_eye, scar_on_face`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 630.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 379.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1140 | 758.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 567.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1140 | 1.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/katarina_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
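Building on the loop above, items can be filtered by tag. A minimal sketch: the helper is pure Python, and the commented `waifuc` usage at the bottom is illustrative, assuming `item.meta['tags']` is iterable as in the loop above:

```python
def has_all_tags(item_tags, required):
    """Return True when every required tag appears in the item's tags."""
    return all(tag in item_tags for tag in required)

# Illustrative usage with the LocalSource loop above:
# for item in source:
#     if has_all_tags(item.meta['tags'], {'1girl', 'solo'}):
#         print(item.meta['filename'])

print(has_all_tags({'1girl', 'solo', 'scar'}, {'1girl', 'solo'}))  # True
```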
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, cleavage, scar, solo, gloves, midriff, navel, dagger, belt, medium_breasts, dual_wielding, jacket, sword |
| 1 | 14 |  |  |  |  |  | 1girl, nipples, navel, solo, looking_at_viewer, pussy, scar, completely_nude, smile, tattoo, red_lips, uncensored, artist_name, lipstick, parted_lips, spread_legs |
| 2 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, penis, scar, navel, nipples, pussy, spread_legs, blush, solo_focus, tattoo, uncensored, rape, torn_clothes, vaginal, armor, belt, clitoris, cum, nude, one_eye_closed, open_mouth, pov, sex_from_behind, teeth |
| 3 | 6 |  |  |  |  |  | 1girl, black_bikini, cleavage, looking_at_viewer, navel, smile, solo, parted_lips, scar, water, blush, collarbone, day, bangs, beach, cloud, ocean, outdoors, sky, stomach, very_long_hair |
| 4 | 6 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, uncensored, cum_in_mouth, nude, scar, blush, cleavage, cum_on_breasts, facial, licking_penis, tongue |
| 5 | 6 |  |  |  |  |  | 1girl, futanari, huge_penis, large_testicles, solo, thick_thighs, uncensored, veiny_penis, huge_breasts, large_penis, looking_at_viewer, arms_behind_head, arms_up, belt, erection, high_heels, lips, no_panties, black_dress, blue_eyes, boots, cleavage, covered_nipples, curvy, makeup, outdoors, standing |
| 6 | 6 |  |  |  |  |  | bare_shoulders, cleavage, fake_animal_ears, looking_at_viewer, pantyhose, playboy_bunny, rabbit_ears, black_leotard, fishnets, scar, collarbone, lipstick, rabbit_tail, smile, wrist_cuffs, 1girl, detached_collar, multiple_girls, parted_lips, red_lips, solo_focus, strapless_leotard, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | scar | solo | gloves | midriff | navel | dagger | belt | medium_breasts | dual_wielding | jacket | sword | nipples | looking_at_viewer | pussy | completely_nude | smile | tattoo | red_lips | uncensored | artist_name | lipstick | parted_lips | spread_legs | 1boy | hetero | penis | blush | solo_focus | rape | torn_clothes | vaginal | armor | clitoris | cum | nude | one_eye_closed | open_mouth | pov | sex_from_behind | teeth | black_bikini | water | collarbone | day | bangs | beach | cloud | ocean | outdoors | sky | stomach | very_long_hair | cum_in_mouth | cum_on_breasts | facial | licking_penis | tongue | futanari | huge_penis | large_testicles | thick_thighs | veiny_penis | huge_breasts | large_penis | arms_behind_head | arms_up | erection | high_heels | lips | no_panties | black_dress | blue_eyes | boots | covered_nipples | curvy | makeup | standing | bare_shoulders | fake_animal_ears | pantyhose | playboy_bunny | rabbit_ears | black_leotard | fishnets | rabbit_tail | wrist_cuffs | detached_collar | multiple_girls | strapless_leotard |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:-------|:---------|:----------|:--------|:---------|:-------|:-----------------|:----------------|:---------|:--------|:----------|:--------------------|:--------|:------------------|:--------|:---------|:-----------|:-------------|:--------------|:-----------|:--------------|:--------------|:-------|:---------|:--------|:--------|:-------------|:-------|:---------------|:----------|:--------|:-----------|:------|:-------|:-----------------|:-------------|:------|:------------------|:--------|:---------------|:--------|:-------------|:------|:--------|:--------|:--------|:--------|:-----------|:------|:----------|:-----------------|:---------------|:-----------------|:---------|:----------------|:---------|:-----------|:-------------|:------------------|:---------------|:--------------|:---------------|:--------------|:-------------------|:----------|:-----------|:-------------|:-------|:-------------|:--------------|:------------|:--------|:------------------|:--------|:---------|:-----------|:-----------------|:-------------------|:------------|:----------------|:--------------|:----------------|:-----------|:--------------|:--------------|:------------------|:-----------------|:--------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | | | X | | X | | | | | X | | X | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | | X | | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | | | | | X | | | | | X | X | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | X | | | | | X | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | | | | | | | | | | | | X | | | X | | X | | | X | X | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_mnli_adj_postfix | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1216115
num_examples: 5209
- name: dev_mismatched
num_bytes: 1316494
num_examples: 5441
- name: test_matched
num_bytes: 1245547
num_examples: 5343
- name: test_mismatched
num_bytes: 1326250
num_examples: 5453
- name: train
num_bytes: 49618201
num_examples: 211276
download_size: 35984149
dataset_size: 54722607
---
# Dataset Card for "MULTI_VALUE_mnli_adj_postfix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/jackal_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jackal/ジャッカル/豺狼/자칼 (Nikke: Goddess of Victory)
This is the dataset of jackal/ジャッカル/豺狼/자칼 (Nikke: Goddess of Victory), containing 70 images and their tags.
The core tags of this character are `long_hair, breasts, multicolored_hair, hair_ornament, bangs, red_eyes, large_breasts, streaked_hair, twintails, hairclip, pink_hair, facial_mark, ahoge, white_hair, pink_eyes, side_ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 70 | 119.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jackal_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 70 | 61.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jackal_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 167 | 131.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jackal_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 70 | 102.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jackal_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 167 | 198.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jackal_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jackal_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
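The packed `IMG+TXT` variants in the table above extract to a flat directory of image files with same-named `.txt` tag files. As a rough sketch of pairing them without waifuc (the file extensions and the comma-separated tag format are assumptions, not guaranteed by the card):

```python
import tempfile
from pathlib import Path

def load_pairs(dataset_dir):
    """Pair each image with the tag list from its same-named .txt file."""
    pairs = []
    for img in sorted(Path(dataset_dir).iterdir()):
        if img.suffix.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt = img.with_suffix(".txt")
        if txt.exists():
            tags = [t.strip() for t in txt.read_text().split(",")]
            pairs.append((img.name, tags))
    return pairs

# tiny demo with dummy files standing in for an extracted archive
with tempfile.TemporaryDirectory() as d:
    Path(d, "0001.png").write_bytes(b"")  # placeholder image bytes
    Path(d, "0001.txt").write_text("1girl, solo, smile")
    pairs = load_pairs(d)
print(pairs)
```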
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, collar, gloves, jacket, looking_at_viewer, open_mouth, solo, smile, blonde_hair, blush, hair_bow, tongue_out, upper_body, cleavage, crop_top, shirt, virtual_youtuber |
| 1 | 9 |  |  |  |  |  | cleavage, looking_at_viewer, 1girl, long_sleeves, open_mouth, pink_gloves, solo, tongue_out, midriff, navel, pink_shorts, blush, crop_top, heart, open_jacket, short_shorts, smile, virtual_youtuber, spiked_collar, asymmetrical_legwear, belt, blonde_hair, pink_thighhighs, tattoo |
| 2 | 6 |  |  |  |  |  | 1girl, pink_shorts, short_shorts, smile, solo, asymmetrical_legwear, long_sleeves, looking_at_viewer, open_mouth, shirt, single_thighhigh, squatting, thigh_strap, cleavage, open_jacket, pink_gloves, sneakers, spiked_collar, tongue_out, white_jacket, blush, crop_top, hair_bow, midriff, mole, navel, pink_belt, single_sock, thighs, very_long_hair, virtual_youtuber, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collar | gloves | jacket | looking_at_viewer | open_mouth | solo | smile | blonde_hair | blush | hair_bow | tongue_out | upper_body | cleavage | crop_top | shirt | virtual_youtuber | long_sleeves | pink_gloves | midriff | navel | pink_shorts | heart | open_jacket | short_shorts | spiked_collar | asymmetrical_legwear | belt | pink_thighhighs | tattoo | single_thighhigh | squatting | thigh_strap | sneakers | white_jacket | mole | pink_belt | single_sock | thighs | very_long_hair | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------|:---------|:--------------------|:-------------|:-------|:--------|:--------------|:--------|:-----------|:-------------|:-------------|:-----------|:-----------|:--------|:-------------------|:---------------|:--------------|:----------|:--------|:--------------|:--------|:--------------|:---------------|:----------------|:-----------------------|:-------|:------------------|:---------|:-------------------|:------------|:--------------|:-----------|:---------------|:-------|:------------|:--------------|:---------|:-----------------|:-----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | | | X | X | X | X | X | X | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | X | X | X | X | | X | X | X | | X | X | X | X | X | X | X | X | X | | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X |
|
tyzhu/find_sent_before_sent_train_100_eval_40 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 777978
num_examples: 644
- name: validation
num_bytes: 223538
num_examples: 202
download_size: 273207
dataset_size: 1001516
---
# Dataset Card for "find_sent_before_sent_train_100_eval_40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |