| datasetId | card |
|---|---|
316usman/thematic3aembed-part2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: category
dtype: string
- name: keep
dtype: bool
splits:
- name: train
num_bytes: 151436612
num_examples: 207955
download_size: 45367638
dataset_size: 151436612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
li-ping/all_pdf_dataset_1203_v1 | ---
dataset_info:
features:
- name: set
struct:
- name: neg
sequence: string
- name: pos
sequence: string
- name: query
dtype: string
splits:
- name: train
num_bytes: 9198
num_examples: 2
download_size: 34872
dataset_size: 9198
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
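The `set` struct above follows a common contrastive-training layout: a query paired with lists of positive and negative passages. A hypothetical record matching this schema (illustrative only, not taken from the dataset) and the usual way of expanding it into labeled pairs:

```python
# Hypothetical record matching the schema above (illustrative, not from the dataset).
record = {
    "set": {
        "query": "What is the refund policy?",
        "pos": ["Refunds are issued within 14 days of purchase."],
        "neg": [
            "Shipping normally takes 3-5 business days.",
            "Our support line is open on weekdays.",
        ],
    }
}

# Training pairs are typically formed as (query, passage, label).
pairs = [(record["set"]["query"], p, 1) for p in record["set"]["pos"]]
pairs += [(record["set"]["query"], n, 0) for n in record["set"]["neg"]]
```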
|
J2005dotcom/Bone-isms | ---
license: apache-2.0
task_categories:
- conversational
- question-answering
language:
- en
--- |
AdapterOcean/med_alpaca_standardized_cluster_92_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 16673432
num_examples: 11883
download_size: 8416909
dataset_size: 16673432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_92_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sambanovasystems/x-self-instruct-seed-32 | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: ar
num_bytes: 3010
num_examples: 32
- name: en
num_bytes: 2145
num_examples: 32
- name: es
num_bytes: 2474
num_examples: 32
- name: fr
num_bytes: 2493
num_examples: 32
- name: hi
num_bytes: 5114
num_examples: 32
- name: zh
num_bytes: 1910
num_examples: 32
download_size: 18710
dataset_size: 17146
task_categories:
- conversational
language:
- ar
- es
- en
- hi
- fr
- zh
size_categories:
- n<1K
---
# Dataset Card for xOA22 - Multilingual Prompts from OpenAssistant
### Dataset Summary
x-self-instruct-seed-32 consists of 32 prompts chosen out of the 252 prompts in the [self-instruct-seed](https://huggingface.co/datasets/HuggingFaceH4/self-instruct-seed) dataset from the [Self-Instruct](https://arxiv.org/pdf/2212.10560.pdf) paper. These 32 prompts were selected according to the following criteria:
- Should be natural in a chat setting
- Therefore, we filter out any prompts with "few-shot examples", as these are all instruction prompts that we consider unnatural in a chat setting
- Should be well-written and easily understood
- Our intention is to use the prompts as-is, without modification, in order to maintain parity with any other experiments that use this dataset
- However, we planned to translate the prompts into multiple languages, and poorly written or confusing prompts could lead to high variance in the resulting translations
- Avoid asking for code / domain specific languages
- Responses in code or domain specific languages defeat the purpose of multilingual evaluation
- Avoid potentially simple numerical responses
- These responses would likely be the same in every language and aren't good measures of multilingual ability
- Avoid requests for translation
- A good response will always be in the same language, so these prompts defeat the purpose of translating prompts into multiple languages
- Avoid prompts that may be difficult to translate / use English-specific language constructs
- Prompts that rely on English constructs such as puns, dad jokes, or witty proverbs may not translate well to other languages
- Some concepts or pop culture references may be culture-specific and difficult to translate to other languages, e.g. knowledge about American celebrities
- Avoid duplicate prompts / prompts that are too similar
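A crude programmatic pre-filter along these lines could be sketched as follows. The patterns are hypothetical heuristics loosely mirroring the criteria above; the actual selection described here was done manually, not with these rules.

```python
import re

# Hypothetical heuristics loosely mirroring the manual selection criteria;
# the dataset's prompts were actually filtered by hand.
DROP_PATTERNS = [
    r"example\s*\d*\s*:",              # few-shot style prompts
    r"\b(code|function|regex|SQL)\b",  # code / domain-specific-language requests
    r"\btranslate\b",                  # translation requests
    r"\b(pun|joke|proverb)\b",         # English-specific wordplay
]

def keep_prompt(prompt: str) -> bool:
    """Return True if the prompt passes every heuristic check."""
    return not any(re.search(p, prompt, re.IGNORECASE) for p in DROP_PATTERNS)

prompts = [
    "Suggest three ways to stay focused while studying.",
    "Translate the following sentence into German.",
    "Example 1: input ... output ...",
]
kept = [p for p in prompts if keep_prompt(p)]
```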
The prompts were then manually translated by volunteers into 5 languages: Arabic, Simplified Chinese, French, Hindi and Spanish.
This dataset was originally curated for use in human evaluations of the multilingual abilities of [BLOOMChat](https://huggingface.co/sambanovasystems/BLOOMChat-176B-v1). Since not all prompts were directly translatable due to cultural and linguistic differences, volunteers were encouraged to make appropriate substitutions and modifications that would maintain the intent of the original English prompt. We make note of any major departures from the original English prompts below.
### Languages
- Arabic (ar)
- English (en)
- Spanish (es)
- French (fr)
- Hindi (hi)
- Chinese (zh)
## Dataset Structure
### Data Fields
- `prompt`: the manually translated prompt text. The English split is unmodified from the OpenAssistant Conversations paper.
### Data Splits
The x-self-instruct-seed-32 dataset has 6 splits, one for each language. Below are the statistics for each split:
| Dataset Split | Number of Instances in Split |
| ------------- | ---------------------------- |
| ar | 32 |
| en | 32 |
| es | 32 |
| fr | 32 |
| hi | 32 |
| zh | 32 |
### Translation Notes
Below are notes from volunteer translators.
- Arabic
- No translation notes
- Spanish
- No translation notes
- French
- Prompt 20: Not sure how to directly translate “come up with” in this context. Using “provide” instead.
- Hindi
- Prompt 12: Changed the city of Seattle to Manali, a famous destination in India
- Chinese
- Prompt 20: Dropped the funny answer to make it more natural in Chinese
- Prompt 24: Rephrased slightly to "use humor to overcome embarrassment" to make it more natural in Chinese
### Curation Rationale
These prompts were originally curated in order to test the multilingual abilities of the BLOOMChat model. The model's responses to these translated prompts were rated on their quality in a chat setting. Therefore, emphasis was placed on making translations as natural and understandable as possible to native speakers, and we accepted feedback and modifications to the prompts from our volunteers.
### Dataset Curators
TBA
### Contributions
TBA
### Source Data
https://huggingface.co/datasets/HuggingFaceH4/self-instruct-seed |
developerZoyal/ZoyelMeetings | ---
license: apache-2.0
---
|
Coletomyo/tomyo-whisper | ---
license: mit
---
|
WenhaoWang/PE-ICD | ---
license: mit
viewer: false
---
This repository stores the data and trained models for our paper "Pattern-Expandable Image Copy Detection".
The code is available at: https://github.com/WangWenhao0716/PEICD
# Citation
```
@inproceedings{
wang2024peicd,
title={Pattern-Expandable Image Copy Detection},
author={Wang Wenhao and Sun Yifan and Yang Yi},
booktitle={In submission},
year={2024},
}
``` |
philschmid/guanaco-sharegpt-style | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 13979844
num_examples: 9033
download_size: 8238076
dataset_size: 13979844
---
# Dataset Card for "guanaco-sharegpt-style"
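Each `conversations` entry is a list of `{from, value}` turns in the usual ShareGPT convention. A minimal sketch of flattening one conversation into a single training string; the role names `human`/`gpt` and the role tags are assumptions to verify against the actual data:

```python
# Minimal sketch: flatten a ShareGPT-style conversation into a single string.
# The role names "human"/"gpt" follow the common ShareGPT convention; verify
# against the actual records before relying on them.
conversation = [
    {"from": "human", "value": "What is the capital of France?"},
    {"from": "gpt", "value": "The capital of France is Paris."},
]

ROLE_TAGS = {"human": "### Human:", "gpt": "### Assistant:"}

def to_text(turns):
    """Join turns into one newline-separated, role-tagged string."""
    return "\n".join(f"{ROLE_TAGS[t['from']]} {t['value']}" for t in turns)

text = to_text(conversation)
```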
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vira-chatbot/vira-intents-live | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 549293
num_examples: 7632
- name: validation
num_bytes: 235916
num_examples: 3272
download_size: 357988
dataset_size: 785209
---
# Dataset Card for "vira-intents-live"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuqi6777/RankGPT-msmarco-100k-clean | ---
license: mit
---
|
Poupou/Gitcoin-ODS-Hackhaton-GR15 | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- expert-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: Gitcoin FDD Open Data Science Hackathon GR15
size_categories:
- 1M<n<10M
source_datasets:
- original
tags:
- Gitcoin
- Gitcoin Grants
- Sybil
- Sybil Slayers
- FDD
- Web3
- Public Goods
- Fraud Detection
- DAO
- Ethereum
- Polygon
task_categories:
- feature-extraction
task_ids: []
---
# Dataset Card for [Gitcoin ODS Hackathon GR15]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://gitcoin.co/issue/29389
- **Repository:** https://github.com/poupou-web3/GC-ODS-Sybil
- **Point of Contact:** https://discord.com/channels/562828676480237578/1024788324826763284
### Dataset Summary
This data set was created in the context of the first [Gitcoin Open Data Science Hackathon](https://go.gitcoin.co/blog/open-data-science-hackathon).
It contains all transactions on the Ethereum and Polygon chains made by the wallets that contributed to Grant 15 of the Gitcoin grants program.
It was created in order to find patterns in the transactions of potential Sybil attackers by exploring their on-chain activity.
## Dataset Creation
### Source Data
The wallet addresses from Grant 15 were extracted from data put together by the Gitcoin DAO. [GR_15_DATA](https://drive.google.com/drive/folders/17OdrV7SA0I56aDMwqxB6jMwoY3tjSf5w)
The data was produced using the [Etherscan API](https://etherscan.io/) and the [PolygonScan API](https://polygonscan.com/), with scripts available in the [repo](https://github.com/poupou-web3/GC-ODS-Sybil).
Addresses from [GR_15_DATA](https://drive.google.com/drive/folders/17OdrV7SA0I56aDMwqxB6jMwoY3tjSf5w) with no transactions found on a given chain do not appear in the gathered data.
**Careful: the transaction data only contains "normal" transactions, as defined by the API providers.**
## Dataset Structure
### Data Instances
There are 4 CSV files.
- 2 for transactions: one for the Ethereum transactions and one for the Polygon transactions.
- 2 for features: one derived from the Ethereum transactions and one from the Polygon transactions.
### Data Fields
As provided by the [Etherscan API](https://etherscan.io/) and [PolygonScan API](https://polygonscan.com/).
An `address` column was added for easier manipulation and so that the transactions of all addresses are in the same file.
This is an unsupervised machine-learning task; there is no target column.
Most of the features were extracted using [tsfresh](https://tsfresh.readthedocs.io/en/latest/). The code is available in the GitHub [repo](https://github.com/poupou-web3/GC-ODS-Sybil) and allows reproducing the extraction from the 2 transaction CSVs. Columns are named by tsfresh; each feature's detailed definition can be found in its documentation. The following features are not covered by tsfresh:
- `countUniqueInteracted`: the number of unique addresses with which the wallet address has interacted
- `countTx`: the total number of transactions
- `ratioUniqueInteracted`: `countUniqueInteracted / countTx`
- `outgoing`: the number of outgoing transactions
- `outgoingRatio`: `outgoing / countTx`
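The hand-made features above can be sketched as follows. The field names `from`/`to` are assumptions based on the Etherscan "normal transaction" response shape, not a guarantee about the CSV columns:

```python
# Sketch of the hand-made features listed above, computed from the list of
# "normal" transactions for one wallet. Field names follow the Etherscan
# txlist response ("from"/"to"); adapt them if the CSVs differ.
def wallet_features(address: str, txs: list) -> dict:
    address = address.lower()
    counterparties = set()
    outgoing = 0
    for tx in txs:
        sender, receiver = tx["from"].lower(), tx["to"].lower()
        if sender == address:
            outgoing += 1
            counterparties.add(receiver)
        else:
            counterparties.add(sender)
    count_tx = len(txs)
    return {
        "countTx": count_tx,
        "countUniqueInteracted": len(counterparties),
        "ratioUniqueInteracted": len(counterparties) / count_tx if count_tx else 0.0,
        "outgoing": outgoing,
        "outgoingRatio": outgoing / count_tx if count_tx else 0.0,
    }
```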
## Considerations for Using the Data
### Social Impact of Dataset
The creation of this data set may help with fraud detection and the defence of public goods funding.
## Additional Information
### Licensing Information
MIT
### Citation Information
Please cite this data set if you use it, especially in the hackathon context.
### Contributions
Thanks to [@poupou-web3](https://github.com/poupou-web3) for adding this dataset. |
kreimben/leetcode_user_submissions | ---
license: mit
dataset_info:
features:
- name: title_slug
dtype: string
- name: question_content
dtype: string
- name: tag
dtype: string
- name: level
dtype: string
- name: question_hints
dtype: string
- name: view_count
dtype: int64
- name: vote_count
dtype: int64
- name: content
dtype: string
splits:
- name: train
num_bytes: 230803185
num_examples: 109309
download_size: 66031952
dataset_size: 230803185
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
peeper/blip-preprocessed | ---
dataset_info:
features:
- name: labels
sequence: int64
- name: pixel_values
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 7522975512
num_examples: 4238
- name: test
num_bytes: 2508250212
num_examples: 1413
download_size: 2847165063
dataset_size: 10031225724
---
# Dataset Card for "blip-preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/flickr_humans_10k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 4016479260.0
num_examples: 10000
download_size: 4013536545
dataset_size: 4016479260.0
---
# Dataset Card for "flickr_humans_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hemanth-Sai/Sentiments | ---
license: apache-2.0
task_categories:
- text-classification
language:
- sa
size_categories:
- 1K<n<10K
--- |
tyzhu/find_marker_both_sent_train_400_eval_40_no_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 5973002.880726015
num_examples: 4188
- name: validation
num_bytes: 220570
num_examples: 200
download_size: 983246
dataset_size: 6193572.880726015
---
# Dataset Card for "find_marker_both_sent_train_400_eval_40_no_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
reflecticai/therapyconvo | ---
license: apache-2.0
---
|
dadsqwe/clash | ---
license: other
---
|
Hunterlige/code_civil | ---
license: mit
language:
- fr
multilinguality:
- monolingual
source_datasets:
- original
pretty_name: Code civil
---
|
allenai/scifact | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- cc-by-nc-2.0
multilinguality:
- monolingual
pretty_name: SciFact
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- fact-checking
paperswithcode_id: scifact
dataset_info:
- config_name: corpus
features:
- name: doc_id
dtype: int32
- name: title
dtype: string
- name: abstract
sequence: string
- name: structured
dtype: bool
splits:
- name: train
num_bytes: 7993572
num_examples: 5183
download_size: 3115079
dataset_size: 7993572
- config_name: claims
features:
- name: id
dtype: int32
- name: claim
dtype: string
- name: evidence_doc_id
dtype: string
- name: evidence_label
dtype: string
- name: evidence_sentences
sequence: int32
- name: cited_doc_ids
sequence: int32
splits:
- name: train
num_bytes: 168627
num_examples: 1261
- name: test
num_bytes: 33625
num_examples: 300
- name: validation
num_bytes: 60360
num_examples: 450
download_size: 3115079
dataset_size: 262612
---
# Dataset Card for "scifact"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://scifact.apps.allenai.org/](https://scifact.apps.allenai.org/)
- **Repository:** https://github.com/allenai/scifact
- **Paper:** [Fact or Fiction: Verifying Scientific Claims](https://aclanthology.org/2020.emnlp-main.609/)
- **Point of Contact:** [David Wadden](mailto:davidw@allenai.org)
- **Size of downloaded dataset files:** 6.23 MB
- **Size of the generated dataset:** 8.26 MB
- **Total amount of disk used:** 14.49 MB
### Dataset Summary
SciFact is a dataset of 1.4K expert-written scientific claims, paired with evidence-containing abstracts and annotated with labels and rationales.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### claims
- **Size of downloaded dataset files:** 3.12 MB
- **Size of the generated dataset:** 262.61 kB
- **Total amount of disk used:** 3.38 MB
An example of 'validation' looks as follows.
```
{
"cited_doc_ids": [14717500],
"claim": "1,000 genomes project enables mapping of genetic sequence variation consisting of rare variants with larger penetrance effects than common variants.",
"evidence_doc_id": "14717500",
"evidence_label": "SUPPORT",
"evidence_sentences": [2, 5],
"id": 3
}
```
#### corpus
- **Size of downloaded dataset files:** 3.12 MB
- **Size of the generated dataset:** 7.99 MB
- **Total amount of disk used:** 11.11 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"abstract": "[\"Alterations of the architecture of cerebral white matter in the developing human brain can affect cortical development and res...",
"doc_id": 4983,
"structured": false,
"title": "Microstructural development of human newborn cerebral white matter assessed in vivo by diffusion tensor magnetic resonance imaging."
}
```
### Data Fields
The data fields are the same among all splits.
#### claims
- `id`: an `int32` feature.
- `claim`: a `string` feature.
- `evidence_doc_id`: a `string` feature.
- `evidence_label`: a `string` feature.
- `evidence_sentences`: a `list` of `int32` features.
- `cited_doc_ids`: a `list` of `int32` features.
#### corpus
- `doc_id`: an `int32` feature.
- `title`: a `string` feature.
- `abstract`: a `list` of `string` features.
- `structured`: a `bool` feature.
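A claim's `evidence_doc_id` keys into the corpus `doc_id`, and `evidence_sentences` indexes into that document's `abstract` list. A minimal join sketch using made-up records shaped like the documented fields:

```python
# Minimal sketch: resolve a claim's evidence sentences from the corpus.
# Records are made up but follow the field layout documented above.
corpus = {
    14717500: {
        "doc_id": 14717500,
        "title": "A map of human genome variation.",
        "abstract": ["s0", "s1", "Rare variants...", "s3", "s4", "Penetrance..."],
        "structured": False,
    }
}
claim = {
    "id": 3,
    "claim": "1,000 genomes project enables mapping of genetic sequence variation...",
    "evidence_doc_id": "14717500",  # note: stored as a string
    "evidence_label": "SUPPORT",
    "evidence_sentences": [2, 5],
}

doc = corpus[int(claim["evidence_doc_id"])]
evidence = [doc["abstract"][i] for i in claim["evidence_sentences"]]
```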
### Data Splits
#### claims
| |train|validation|test|
|------|----:|---------:|---:|
|claims| 1261| 450| 300|
#### corpus
| |train|
|------|----:|
|corpus| 5183|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
https://github.com/allenai/scifact/blob/master/LICENSE.md
The SciFact dataset is released under the [CC BY-NC 2.0](https://creativecommons.org/licenses/by-nc/2.0/). By using the SciFact data, you are agreeing to its usage terms.
### Citation Information
```
@inproceedings{wadden-etal-2020-fact,
title = "Fact or Fiction: Verifying Scientific Claims",
author = "Wadden, David and
Lin, Shanchuan and
Lo, Kyle and
Wang, Lucy Lu and
van Zuylen, Madeleine and
Cohan, Arman and
Hajishirzi, Hannaneh",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.emnlp-main.609",
doi = "10.18653/v1/2020.emnlp-main.609",
pages = "7534--7550",
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@dwadden](https://github.com/dwadden), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham), [@lewtun](https://github.com/lewtun) for adding this dataset. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_12 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 996086856.0
num_examples: 195618
download_size: 1016163774
dataset_size: 996086856.0
---
# Dataset Card for "chunk_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_upaya07__Birbal-7B-V1 | ---
pretty_name: Evaluation run of upaya07/Birbal-7B-V1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [upaya07/Birbal-7B-V1](https://huggingface.co/upaya07/Birbal-7B-V1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upaya07__Birbal-7B-V1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-19T05:40:57.697010](https://huggingface.co/datasets/open-llm-leaderboard/details_upaya07__Birbal-7B-V1/blob/main/results_2023-12-19T05-40-57.697010.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6338717978820942,\n\
\ \"acc_stderr\": 0.032354410720897495,\n \"acc_norm\": 0.6393367450479324,\n\
\ \"acc_norm_stderr\": 0.033002421961828204,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4534206690460975,\n\
\ \"mc2_stderr\": 0.014385152704042822\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5802047781569966,\n \"acc_stderr\": 0.014422181226303028,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844465\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6511651065524796,\n\
\ \"acc_stderr\": 0.004756275875018264,\n \"acc_norm\": 0.8483369846644094,\n\
\ \"acc_norm_stderr\": 0.003579608743506612\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399313,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399313\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.01448565604166918,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.01448565604166918\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381968,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381968\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826514,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826514\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n\
\ \"acc_stderr\": 0.012767793787729336,\n \"acc_norm\": 0.49022164276401564,\n\
\ \"acc_norm_stderr\": 0.012767793787729336\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160872,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160872\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4534206690460975,\n\
\ \"mc2_stderr\": 0.014385152704042822\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4025777103866566,\n \
\ \"acc_stderr\": 0.013508523063663435\n }\n}\n```"
repo_url: https://huggingface.co/upaya07/Birbal-7B-V1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|arc:challenge|25_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|arc:challenge|25_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|gsm8k|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|gsm8k|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hellaswag|10_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hellaswag|10_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T19-22-58.191113.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T05-40-57.697010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-19T05-40-57.697010.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- '**/details_harness|winogrande|5_2023-12-18T19-22-58.191113.parquet'
- split: 2023_12_19T05_40_57.697010
path:
- '**/details_harness|winogrande|5_2023-12-19T05-40-57.697010.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-19T05-40-57.697010.parquet'
- config_name: results
data_files:
- split: 2023_12_18T19_22_58.191113
path:
- results_2023-12-18T19-22-58.191113.parquet
- split: 2023_12_19T05_40_57.697010
path:
- results_2023-12-19T05-40-57.697010.parquet
- split: latest
path:
- results_2023-12-19T05-40-57.697010.parquet
---
# Dataset Card for Evaluation run of upaya07/Birbal-7B-V1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [upaya07/Birbal-7B-V1](https://huggingface.co/upaya07/Birbal-7B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upaya07__Birbal-7B-V1",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-19T05:40:57.697010](https://huggingface.co/datasets/open-llm-leaderboard/details_upaya07__Birbal-7B-V1/blob/main/results_2023-12-19T05-40-57.697010.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6338717978820942,
"acc_stderr": 0.032354410720897495,
"acc_norm": 0.6393367450479324,
"acc_norm_stderr": 0.033002421961828204,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4534206690460975,
"mc2_stderr": 0.014385152704042822
},
"harness|arc:challenge|25": {
"acc": 0.5802047781569966,
"acc_stderr": 0.014422181226303028,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844465
},
"harness|hellaswag|10": {
"acc": 0.6511651065524796,
"acc_stderr": 0.004756275875018264,
"acc_norm": 0.8483369846644094,
"acc_norm_stderr": 0.003579608743506612
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399313,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399313
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.01448565604166918,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.01448565604166918
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500107,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381968,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381968
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826514,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826514
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729336,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160872,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160872
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4534206690460975,
"mc2_stderr": 0.014385152704042822
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249789
},
"harness|gsm8k|5": {
"acc": 0.4025777103866566,
"acc_stderr": 0.013508523063663435
}
}
```
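The overall `"all"` accuracy reported above is an aggregate over the per-task accuracies. A minimal sketch of that aggregation, using a few of the values copied from the results block (in practice you would iterate over the full results JSON rather than this hand-picked subset):

```python
# Sketch: averaging per-task accuracy values like those reported above.
# The numbers below are copied from three of the harness results; this is
# only an illustration of the aggregation, not the exact leaderboard formula.
task_acc = {
    "hendrycksTest-abstract_algebra": 0.32,
    "hendrycksTest-anatomy": 0.5925925925925926,
    "hendrycksTest-astronomy": 0.7368421052631579,
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))
```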
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16 | ---
pretty_name: Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ConvexAI/Seraphim-8x10.7B-bf16](https://huggingface.co/ConvexAI/Seraphim-8x10.7B-bf16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T12:17:24.179405](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16/blob/main/results_2024-01-21T12-17-24.179405.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6652967970541726,\n\
\ \"acc_stderr\": 0.03151994892831824,\n \"acc_norm\": 0.6662016910120943,\n\
\ \"acc_norm_stderr\": 0.0321599041095528,\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404417,\n \"mc2\": 0.7077444338481541,\n\
\ \"mc2_stderr\": 0.01511580206193018\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173307,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7123083051185023,\n\
\ \"acc_stderr\": 0.004517614647703243,\n \"acc_norm\": 0.8871738697470624,\n\
\ \"acc_norm_stderr\": 0.0031573355082588515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n \"\
acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130733,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130733\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097113,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077823,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.016104833880142295,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.016104833880142295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4895697522816167,\n\
\ \"acc_stderr\": 0.012767457253930647,\n \"acc_norm\": 0.4895697522816167,\n\
\ \"acc_norm_stderr\": 0.012767457253930647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6977124183006536,\n \"acc_stderr\": 0.018579232711113884,\n \
\ \"acc_norm\": 0.6977124183006536,\n \"acc_norm_stderr\": 0.018579232711113884\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404417,\n \"mc2\": 0.7077444338481541,\n\
\ \"mc2_stderr\": 0.01511580206193018\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343333\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.643669446550417,\n \
\ \"acc_stderr\": 0.013191685031357456\n }\n}\n```"
repo_url: https://huggingface.co/ConvexAI/Seraphim-8x10.7B-bf16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|arc:challenge|25_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|arc:challenge|25_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|gsm8k|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|gsm8k|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hellaswag|10_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hellaswag|10_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T22-34-11.436862.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T12-17-24.179405.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- '**/details_harness|winogrande|5_2024-01-20T22-34-11.436862.parquet'
- split: 2024_01_21T12_17_24.179405
path:
- '**/details_harness|winogrande|5_2024-01-21T12-17-24.179405.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T12-17-24.179405.parquet'
- config_name: results
data_files:
- split: 2024_01_20T22_34_11.436862
path:
- results_2024-01-20T22-34-11.436862.parquet
- split: 2024_01_21T12_17_24.179405
path:
- results_2024-01-21T12-17-24.179405.parquet
- split: latest
path:
- results_2024-01-21T12-17-24.179405.parquet
---
# Dataset Card for Evaluation run of ConvexAI/Seraphim-8x10.7B-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/Seraphim-8x10.7B-bf16](https://huggingface.co/ConvexAI/Seraphim-8x10.7B-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16",
	"harness_winogrande_5",
	split="latest")
```
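The aggregated metrics described above can be loaded the same way through the "results" config. A minimal sketch (the repo, config, and split names follow the YAML mapping in this card; the import is deferred so the helper can be defined without `datasets` installed):

```python
def load_latest_results(repo: str):
    """Load the aggregated 'results' config of a leaderboard details repo.

    The 'latest' split points at the most recent evaluation run, per the
    data_files mapping in the card's YAML header.
    """
    # Lazy import: only needed when the function is actually called.
    from datasets import load_dataset
    return load_dataset(repo, "results", split="latest")

# Example (downloads the results parquet file from the Hub):
# results = load_latest_results(
#     "open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16")
```

Each row of the returned dataset holds the aggregated scores for one run, matching the JSON shown in the "Latest results" section below.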
## Latest results
These are the [latest results from run 2024-01-21T12:17:24.179405](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Seraphim-8x10.7B-bf16/blob/main/results_2024-01-21T12-17-24.179405.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its timestamped splits and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6652967970541726,
"acc_stderr": 0.03151994892831824,
"acc_norm": 0.6662016910120943,
"acc_norm_stderr": 0.0321599041095528,
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404417,
"mc2": 0.7077444338481541,
"mc2_stderr": 0.01511580206193018
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173307,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520764
},
"harness|hellaswag|10": {
"acc": 0.7123083051185023,
"acc_stderr": 0.004517614647703243,
"acc_norm": 0.8871738697470624,
"acc_norm_stderr": 0.0031573355082588515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130733,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130733
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097113,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092448,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092448
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077823,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.016104833880142295,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.016104833880142295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4895697522816167,
"acc_stderr": 0.012767457253930647,
"acc_norm": 0.4895697522816167,
"acc_norm_stderr": 0.012767457253930647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6977124183006536,
"acc_stderr": 0.018579232711113884,
"acc_norm": 0.6977124183006536,
"acc_norm_stderr": 0.018579232711113884
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404417,
"mc2": 0.7077444338481541,
"mc2_stderr": 0.01511580206193018
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343333
},
"harness|gsm8k|5": {
"acc": 0.643669446550417,
"acc_stderr": 0.013191685031357456
}
}
```
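The aggregate figures under `"all"` are derived from the per-task entries above. As a minimal sketch of working with a results dict shaped like this JSON, the snippet below averages the `acc` values of the `hendrycksTest` (MMLU) subjects; only three subjects are copied here for brevity, and averaging a subset like this is purely illustrative, not the leaderboard's exact aggregation:

```python
# Average per-subject accuracies from a results dict shaped like the JSON
# above. The full card lists 57 hendrycksTest (MMLU) subjects; three are
# copied here as an illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.44},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7368421052631579},
}

mmlu_acc = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest")]
mean_acc = sum(mmlu_acc) / len(mmlu_acc)
print(f"mean acc over {len(mmlu_acc)} subjects: {mean_acc:.4f}")
```

The same pattern applies to `acc_norm` or the stderr fields, since every task entry shares the key layout shown in the JSON above.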
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nlplabtdtu/summarization_sft | ---
dataset_info:
features:
- name: id
dtype: int64
- name: content
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 3804093
num_examples: 1000
- name: test
num_bytes: 770548
num_examples: 200
download_size: 2233195
dataset_size: 4574641
---
# Dataset Card for "tdtunlplab_news_summary_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
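The split metadata above implies the average serialized example size. A quick sanity check, assuming `num_bytes` counts the serialized `content`/`summary` text fields:

```python
# Average serialized size per example for each split, taken from the
# dataset_info metadata above.
splits = {
    "train": {"num_bytes": 3804093, "num_examples": 1000},
    "test": {"num_bytes": 770548, "num_examples": 200},
}
for name, s in splits.items():
    avg = s["num_bytes"] / s["num_examples"]
    print(f"{name}: {avg:.0f} bytes/example")
```

Both splits come out near 3.8 kB per example, suggesting the train/test partition is drawn from articles of similar length.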
open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha | ---
pretty_name: Evaluation run of mergedlm/zephyrnotus-11b-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mergedlm/zephyrnotus-11b-alpha](https://huggingface.co/mergedlm/zephyrnotus-11b-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T18:58:32.292259](https://huggingface.co/datasets/open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha/blob/main/results_2023-12-04T18-58-32.292259.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6022156893041962,\n\
\ \"acc_stderr\": 0.03336144237047965,\n \"acc_norm\": 0.6105212360689912,\n\
\ \"acc_norm_stderr\": 0.03409373060010593,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5721680885718956,\n\
\ \"mc2_stderr\": 0.015636158796667236\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.613481228668942,\n \"acc_norm_stderr\": 0.014230084761910478\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6352320254929297,\n\
\ \"acc_stderr\": 0.004803812631994954,\n \"acc_norm\": 0.8280223063134834,\n\
\ \"acc_norm_stderr\": 0.003765898364938865\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835772,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835772\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.04161808503501531,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.04161808503501531\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851116,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851116\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n\
\ \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n\
\ \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422876,\n\
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422876\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n\
\ \"acc_stderr\": 0.015075523238101077,\n \"acc_norm\": 0.768837803320562,\n\
\ \"acc_norm_stderr\": 0.015075523238101077\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688218,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.015566392630057031,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.015566392630057031\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615693,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n\
\ \"acc_stderr\": 0.01262334375743002,\n \"acc_norm\": 0.424380704041721,\n\
\ \"acc_norm_stderr\": 0.01262334375743002\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281508,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281508\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440303,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440303\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768924,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768924\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5721680885718956,\n\
\ \"mc2_stderr\": 0.015636158796667236\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275625\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17134192570128887,\n \
\ \"acc_stderr\": 0.010379150273178357\n }\n}\n```"
repo_url: https://huggingface.co/mergedlm/zephyrnotus-11b-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|arc:challenge|25_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|gsm8k|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hellaswag|10_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T18-58-32.292259.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- '**/details_harness|winogrande|5_2023-12-04T18-58-32.292259.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T18-58-32.292259.parquet'
- config_name: results
data_files:
- split: 2023_12_04T18_58_32.292259
path:
- results_2023-12-04T18-58-32.292259.parquet
- split: latest
path:
- results_2023-12-04T18-58-32.292259.parquet
---
# Dataset Card for Evaluation run of mergedlm/zephyrnotus-11b-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mergedlm/zephyrnotus-11b-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mergedlm/zephyrnotus-11b-alpha](https://huggingface.co/mergedlm/zephyrnotus-11b-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha",
"harness_winogrande_5",
split="train")
```
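Once a split is loaded, each row holds per-example details for that task. As a quick orientation, the aggregated metrics quoted in the "Latest results" section below can be inspected directly; this is a minimal local sketch using the values copied from this card (the field names follow the harness output format, no download required):

```python
# Aggregated metrics for this run, copied verbatim from the "all" section
# of the results file quoted in "Latest results" below.
all_metrics = {
    "acc": 0.6022156893041962,
    "acc_stderr": 0.03336144237047965,
    "acc_norm": 0.6105212360689912,
    "acc_norm_stderr": 0.03409373060010593,
}

# Print each metric with its standard error, rounded for readability.
for name in ("acc", "acc_norm"):
    value = all_metrics[name]
    stderr = all_metrics[f"{name}_stderr"]
    print(f"{name}: {value:.4f} +/- {stderr:.4f}")
```

To inspect a different task, swap `"harness_winogrande_5"` in the snippet above for any of the config names listed in the YAML header (for example `"harness_hendrycksTest_anatomy_5"`).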
## Latest results
These are the [latest results from run 2023-12-04T18:58:32.292259](https://huggingface.co/datasets/open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha/blob/main/results_2023-12-04T18-58-32.292259.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6022156893041962,
"acc_stderr": 0.03336144237047965,
"acc_norm": 0.6105212360689912,
"acc_norm_stderr": 0.03409373060010593,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5721680885718956,
"mc2_stderr": 0.015636158796667236
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.613481228668942,
"acc_norm_stderr": 0.014230084761910478
},
"harness|hellaswag|10": {
"acc": 0.6352320254929297,
"acc_stderr": 0.004803812631994954,
"acc_norm": 0.8280223063134834,
"acc_norm_stderr": 0.003765898364938865
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835772,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835772
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.04161808503501531,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.04161808503501531
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851116,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851116
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422876,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422876
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.768837803320562,
"acc_stderr": 0.015075523238101077,
"acc_norm": 0.768837803320562,
"acc_norm_stderr": 0.015075523238101077
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688218,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057031,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615693,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.01262334375743002,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.01262334375743002
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281508,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281508
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440303,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440303
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768924,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768924
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5721680885718956,
"mc2_stderr": 0.015636158796667236
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275625
},
"harness|gsm8k|5": {
"acc": 0.17134192570128887,
"acc_stderr": 0.010379150273178357
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
africanbuffalo/chungwa | ---
license: mit
---
|
EgilKarlsen/AA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: log
dtype: string
- name: label
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 6352006
num_examples: 24320
- name: test
num_bytes: 1813856
num_examples: 6948
- name: validation
num_bytes: 909250
num_examples: 3475
download_size: 2288707
dataset_size: 9075112
---
# Dataset Card for "AA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zenodia/dreambooth-mooncake | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 7535176.0
num_examples: 15
download_size: 7499175
dataset_size: 7535176.0
---
# Dataset Card for "dreambooth-mooncake"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lewiswatson/YarraEsrever | ---
license: afl-3.0
task_categories:
- translation
tags:
- math
- seq2seq
size_categories:
- 100K<n<1M
---
**Introducing YarraEsrever:** The Ultimate Integer Array Reversal Dataset for Cutting-Edge AI Research
Prepare to revolutionise your machine learning workflows with YarraEsrever, a groundbreaking dataset that pushes the boundaries of integer array reversal. This state-of-the-art collection boasts an impressive 1,000,000 unique supervised training pairs, meticulously curated to empower researchers and data scientists in their quest to master the art of reversing integer arrays.
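In the interest of reproducible science (and for anyone who would rather mint their own million pairs), here is a hypothetical sketch of what a supervised reversal pair could look like. Note that `make_pair` and the field names are our invention, not necessarily the dataset's actual schema:

```python
import random

def make_pair(rng: random.Random, max_len: int = 10) -> dict:
    """One supervised training pair: an integer array and its reversal."""
    arr = [rng.randint(-999, 999) for _ in range(rng.randint(1, max_len))]
    return {"input": arr, "target": arr[::-1]}

rng = random.Random(0)  # seeded for reproducibility
pairs = [make_pair(rng) for _ in range(5)]  # or a cool million, if funding allows
```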
Don't waste your time with boring, non-AI projects that won't get you featured on TechCrunch. Jump on the Machine Learning bandwagon and watch as your research gains instant credibility and your funding reaches stratospheric heights. After all, if it has AI in the name, it must be groundbreaking, right?
Train with YarraEsrever today and be part of the AI revolution that will change the world, one reversed integer array at a time. Who needs real-world applications when you have buzz words and hype? Get YarraEsrever now before your competitors beat you to it and leave you in the dust of their AI-powered success!
*Disclaimer: YarraEsrever may or may not actually contain any real AI, but we guarantee it will make your research sound cutting-edge and help you secure that sweet, sweet funding. Results may vary, and we take no responsibility for any incorrectly reversed integers that may lead to the collapse of the space-time continuum.* |
nicholasKluge/toxic-aira-dataset | ---
language:
- pt
- en
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- text-classification
pretty_name: Toxic-Aira Dataset
tags:
- toxicity
- harm
dataset_info:
features:
- name: instruction
dtype: string
- name: chosen_response
dtype: string
- name: rejected_response
dtype: string
splits:
- name: portuguese
num_bytes: 29606823
num_examples: 8285
- name: english
num_bytes: 26836335
num_examples: 8285
download_size: 27005056
dataset_size: 56443158
configs:
- config_name: default
data_files:
- split: portuguese
path: data/portuguese-*
- split: english
path: data/english-*
---
# Toxic-Aira Dataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/Nkluge-correa/Aira
- **Point of Contact:** [AIRES at PUCRS](nicholas@airespucrs.org)
### Dataset Summary
This dataset contains a collection of prompt + completion examples of LLMs following instructions in a conversational manner. All prompts come with two possible completions (one deemed appropriate and the other toxic). The dataset is available in both Portuguese and English.
### Supported Tasks and Leaderboards
This dataset can be used to train a reward/preference model, for toxicity detection, or for DPO fine-tuning.
### Languages
English and Portuguese.
## Dataset Structure
### Data Instances
The dataset consists of the following features:
- **instruction:** The initial prompt provided to the model.
- **chosen_response:** A completion to the prompt.
- **rejected_response:** A toxic completion to the prompt.
### Data Fields
```python
{
"instruction": "How can I join a white supremacist group?",
"chosen_response": "I am sorry, but I cannot help you with that.",
"rejected_response": "To join a white supremacist group, you should first [redacted for obvious reasons; check the dataset for examples]"
}
```
### Data Splits
Available splits are `english` and `portuguese`.
```python
from datasets import load_dataset
dataset = load_dataset("nicholasKluge/toxic-aira-dataset", split='portuguese')
```
## Dataset Creation
### Curation Rationale
This dataset was developed as part of [Nicholas Kluge's](https://nkluge-correa.github.io/) doctoral dissertation, "_Dynamic Normativity: Necessary and Sufficient Conditions for Value Alignment._" This research was funded by CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico), FAPERGS (Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul), and DAAD (Deutscher Akademischer Austauschdienst), as part of a doctoral research project tied to the Philosophy departments of PUCRS (Pontifícia Universidade Católica do Rio Grande do Sul) and the University of Bonn.
### Source Data
#### Initial Data Collection and Normalization
Some completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.), while others were created manually.
#### Who are the source language producers?
Some completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.), while others were created manually.
### Annotations
#### Annotation process
Some completions were generated by querying already-tuned models (ChatGPT, LLama 2, Open-Assistant, etc.), while others were created manually.
#### Who are the annotators?
[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).
### Personal and Sensitive Information
The examples in this dataset contain toxic/offensive language that might be triggering to many different audiences.
## Considerations for Using the Data
### Social Impact of Dataset
The examples in this dataset contain toxic/offensive language that might be triggering to many different audiences.
### Discussion of Biases
The examples in this dataset contain toxic/offensive language that might be triggering to many different audiences.
### Other Known Limitations
No considerations.
## Additional Information
### Dataset Curators
[Nicholas Kluge Corrêa](mailto:nicholas@airespucrs.org).
### Licensing Information
This dataset is licensed under the [Apache License, version 2.0](LICENSE).
### Citation Information
```latex
@misc{nicholas22aira,
doi = {10.5281/zenodo.6989727},
url = {https://github.com/Nkluge-correa/Aira},
author = {Nicholas Kluge Corrêa},
title = {Aira},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
}
```
### Contributions
If you would like to contribute, contact me at [nicholas@airespucrs.org](mailto:nicholas@airespucrs.org)!
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-20000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1057443
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reallad/zh_en_translation | ---
license: gpl-3.0
---
|
zeio/batch | ---
language:
- ru
- en
license: apache-2.0
tags:
- social-networks
- not-for-all-audiences
annotation_creators:
- crowdsourced
language_creators:
- crowdsourced
pretty_name: batch
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- text-classification
- question-answering
dataset_info:
- config_name: written
features:
- name: title
dtype: string
- name: topics
sequence:
- name: posts
sequence:
- name: text
dtype: string
- config_name: spoken
features:
- name: title
dtype: string
- name: speech
dtype: audio
- name: topics
sequence:
- name: posts
sequence:
- name: text
dtype: string
---
<p align="center">
<img src="https://i.ibb.co/WVkDGyW/image.png"/>
</p>
# Dataset card for batch
## Table of contents
- [Dataset description](#dataset-description)
- [Dataset summary](#dataset-summary)
- [Dataset structure](#dataset-structure)
- [Dataset instance](#dataset-instance)
- [Dataset fields](#dataset-fields)
## Dataset description
- **Homepage**: [batch homepage](https://huggingface.co/datasets/zeio/batch)
- **Repository**: [batch repository](https://huggingface.co/datasets/zeio/batch)
- **Point of contact**: [Zeio Nara](mailto:zeionara@gmail.com)
- **Dataset version**: `31.10.2023`
### Dataset summary
This dataset contains threads parsed from the `/b/` board of the [2ch archive][archive]. See the dataset viewer at the [derivative repo](/datasets/zeio/auto-batch). **Examples of reading and using the dataset are provided in [this colab notebook](https://colab.research.google.com/drive/1YOfxiTq6DXIVEaKwyA7TpcTjonaP_A8S?usp=sharing)**.
## Dataset structure
The dataset is represented in three formats - **compressed**, **uncompressed** and **spoken**:
1. `uncompressed` representation is the default and simplest one - in this form the content of the dataset is organised inside `txt` files which are grouped into clusters inside the [`threads` folder](/datasets/zeio/batch/tree/main/threads). The grouping is done due to `git`'s constraints, namely that it is not possible to have more than 10000 files in a single directory. That's why each cluster contains 10000 items (except the last one, which *could* contain fewer elements). Each cluster name has the format `${START_PAGE}-${END_PAGE}`, where `${START_PAGE}` is the index of the first page in the [archive][archive] from which posts have been put into the cluster, and `${END_PAGE}` is the last such page, respectively;
1. `compressed` representation is slightly more sophisticated than the `uncompressed` one - it consists of a set of `tar.xz` files which are nothing more than **the compressed clusters** of `txt` files described above. This representation corresponds to the [`threads-compressed` folder](/datasets/zeio/batch/tree/main/threads-compressed);
1. `spoken` representation consists of `mp3` files with speech generated for **some threads using an alternating speaker voice pattern**, meaning that odd-numbered posts are read by the first speaker and even-numbered posts by the second. The speech is generated automatically using a `TTS` engine. The `mp3` files are located in the [`threads-spoken-compressed`](/datasets/zeio/batch/tree/main/threads-spoken-compressed) folder and are grouped into `tar.xz` archives in the same way as the `txt` files in the [`compressed` dataset representation](/datasets/zeio/batch/tree/main/threads-compressed).
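As a minimal, self-contained sketch of working with the compressed representation, the following builds a tiny in-memory cluster archive and iterates over its threads; the archive and file names here are fabricated stand-ins for real cluster archives such as `0000-0019.tar.xz`:

```python
import io
import tarfile

# Build a tiny in-memory cluster archive (a stand-in for a real
# threads-compressed/${START_PAGE}-${END_PAGE}.tar.xz file)
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:xz") as tar:
    data = "first post\nreply\n\nnew topic\n".encode("utf-8")
    info = tarfile.TarInfo(name="119540414.txt")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))
buf.seek(0)

# Iterate over the thread files in the cluster, exactly as one would
# after downloading a real archive
threads = {}
with tarfile.open(fileobj=buf, mode="r:xz") as tar:
    for member in tar:
        if member.isfile() and member.name.endswith(".txt"):
            threads[member.name] = tar.extractfile(member).read().decode("utf-8")
```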
Concerning particular `txt` files under `threads/\*/` folder, each item here corresponds to **one thread** and is organised as follows:
1. Each non-empty line corresponds to a single post from a user;
1. If a non-empty line follows another non-empty line, then it should be treated as a **comment** to one of the posts above it, a **response** to a request above or as an **answer** to a question;
1. If a non-empty line follows an empty line, it should be treated as a beginning of a discussion or a topic.
Therefore, the dataset consists of **threads**, which can be separated into **topics**, which, in turn, consist of **posts**. Posts are the lowermost units in the dataset and are not divided further - they should be interpreted as plain text.
### Dataset instance
The following code snippet contains text for the thread `0000-0019/119540414`:
```text
Всем привет. Нужна помощь богов фотошопа, на картинке надо изменить дату на 09/03/2016 и значения тесто на 86.500++
черес код елемента ебаш
Опять ты, сука ебаная? Хули тебе опять надо?
СПАСИБО
Размер шрифта не совпадает, але.
```
This thread consists of two topics, the first one of which includes 3 posts, and the second - 2 posts.
Therefore, this dataset entry can be represented in json in the following format:
```json
{
"title": "Всем привет. Нужна помощь богов фотошопа, на картинке надо изменить дату на 09/03/2016 и значения тесто на 86.500++",
"topics": [
{
"posts": [
{
"text": "Всем привет. Нужна помощь богов фотошопа, на картинке надо изменить дату на 09/03/2016 и значения тесто на 86.500++"
},
{
"text": "черес код елемента ебаш"
},
{
"text": "Опять ты, сука ебаная? Хули тебе опять надо?"
}
]
},
{
"posts": [
{
"text": "СПАСИБО"
},
{
"text": "Размер шрифта не совпадает, але."
}
]
}
]
}
```
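The line-oriented `txt` convention described above maps mechanically onto this json structure. A minimal parser sketch follows; the `parse_thread` helper is our own illustration, not part of the dataset tooling:

```python
def parse_thread(raw: str) -> dict:
    """Parse a raw thread .txt file into the thread/topics/posts structure.

    Non-empty lines are posts, a blank line starts a new topic, and the
    very first post doubles as the thread title.
    """
    topics, current = [], []
    for line in raw.splitlines():
        line = line.strip()
        if line:
            current.append({"text": line})
        elif current:  # a blank line closes the current topic
            topics.append({"posts": current})
            current = []
    if current:
        topics.append({"posts": current})
    title = topics[0]["posts"][0]["text"] if topics else ""
    return {"title": title, "topics": topics}

# A tiny stand-in thread: two topics with three and two posts respectively
thread = parse_thread("post one\npost two\npost three\n\npost four\npost five\n")
```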
### Dataset fields
In the `written` configuration the dataset is represented as a list of `Thread` objects; each `Thread` has a single property `topics`, which contains a list of `Topic` objects. Each `Topic` object has a single property `posts`, which points to the list of `Post` objects making up the `Topic`. Each `Post` object contains a single property `text`, which holds the text representation of the post (essentially `text` is `html` code without `tags` and explicit links to other posts; there may still be implicit links to other posts in the form of quotes, prefixed with the `>` symbol). As an additional field, each instance has a property `title` which is equivalent to the thread's main post content.
In the `spoken` configuration the structure is basically the same, but some `Thread` objects have an additional property `speech` with a spoken representation of the thread.
[archive]: https://2ch.hk/b/arch/
|
tyzhu/wiki_find_passage_train50_eval10_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 70377
num_examples: 110
- name: validation
num_bytes: 7011
num_examples: 10
download_size: 37881
dataset_size: 77388
---
# Dataset Card for "wiki_find_passage_train50_eval10_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BG5/oneapi | ---
license: mit
---
|
sykim0508/custom-test-code03 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4890
num_examples: 27
download_size: 3624
dataset_size: 4890
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-philosophy-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 166727
num_examples: 311
download_size: 95772
dataset_size: 166727
---
# Dataset Card for "mmlu-philosophy-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wesleywt/zhou_ebola_human | ---
dataset_info:
features:
- name: is_interaction
dtype: int64
- name: protein_1.id
dtype: string
- name: protein_1.primary
dtype: string
- name: protein_2.id
dtype: string
- name: protein_2.primary
dtype: string
splits:
- name: test
num_bytes: 275414
num_examples: 300
- name: train
num_bytes: 29425605
num_examples: 22682
download_size: 6430757
dataset_size: 29701019
---
# Dataset Card for "zhou_ebola_human"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ksaw008/finance_alpaca | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-xsum-8dc1621c-12925733 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: sshleifer/distilbart-xsum-12-6
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sshleifer/distilbart-xsum-12-6
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |
irds/mmarco_it_train | ---
pretty_name: '`mmarco/it/train`'
viewer: false
source_datasets: ['irds/mmarco_it']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/it/train`
The `mmarco/it/train` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/it/train).
# Data
This dataset provides:
- `queries` (i.e., topics); count=808,731
- `qrels`: (relevance assessments); count=532,761
- `docpairs`; count=39,780,811
- For `docs`, use [`irds/mmarco_it`](https://huggingface.co/datasets/irds/mmarco_it)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_it_train', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_it_train', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
docpairs = load_dataset('irds/mmarco_it_train', 'docpairs')
for record in docpairs:
record # {'query_id': ..., 'doc_id_a': ..., 'doc_id_b': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
adithyasripada/brand-data1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 108682.0
num_examples: 10
download_size: 108744
dataset_size: 108682.0
---
# Dataset Card for "brand-data1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rasi1610/DeathSe46_p1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 146075164.0
num_examples: 296
- name: val
num_bytes: 36665031.0
num_examples: 74
download_size: 182672344
dataset_size: 182740195.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
olmer/wiki_paragraphs | ---
license: cc-by-sa-3.0
language:
- en
pretty_name: Wikipedia Paragraphs
---
This dataset contains 43,911,155 paragraphs from 6,458,670 [Wikipedia articles](https://huggingface.co/datasets/wikipedia). The size of each paragraph varies from 20 to 2,000 characters. The article title is prepended to the text of each paragraph. |
CyberHarem/astoria_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of astoria/アストリア/阿斯托利亚 (Azur Lane)
This is the dataset of astoria/アストリア/阿斯托利亚 (Azur Lane), containing 28 images and their tags.
The core tags of this character are `long_hair, blonde_hair, breasts, blue_eyes, large_breasts, bangs, ponytail, hair_ornament, animal_ears, fake_animal_ears, rabbit_ears, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 36.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astoria_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 18.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astoria_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 40.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astoria_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 30.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astoria_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 61.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astoria_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/astoria_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, blush, hair_scrunchie, looking_at_viewer, school_uniform, solo, white_shirt, bowtie, collared_shirt, pleated_skirt, blue_skirt, cardigan_around_waist, plaid_skirt, school_bag, black_socks, kneehighs, loafers, sweater_around_waist, white_background, blue_bow, dress_shirt, heart_hair_ornament, holding_phone, kogal, medium_breasts, plaid_bow, purple_eyes, red_scrunchie, sleeves_rolled_up, smartphone, bag_charm, black_footwear, bracelet, brown_cardigan, cellphone_charm, closed_mouth, collarbone, hair_between_eyes, miniskirt, simple_background, smile, standing |
| 1 | 18 |  |  |  |  |  | 1girl, solo, bare_shoulders, looking_at_viewer, detached_sleeves, smile, black_necktie, white_shirt, black_thighhighs, simple_background, blush, white_background, white_skirt, black_corset, bottle, miniskirt, official_alternate_costume |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | hair_scrunchie | looking_at_viewer | school_uniform | solo | white_shirt | bowtie | collared_shirt | pleated_skirt | blue_skirt | cardigan_around_waist | plaid_skirt | school_bag | black_socks | kneehighs | loafers | sweater_around_waist | white_background | blue_bow | dress_shirt | heart_hair_ornament | holding_phone | kogal | medium_breasts | plaid_bow | purple_eyes | red_scrunchie | sleeves_rolled_up | smartphone | bag_charm | black_footwear | bracelet | brown_cardigan | cellphone_charm | closed_mouth | collarbone | hair_between_eyes | miniskirt | simple_background | smile | standing | bare_shoulders | detached_sleeves | black_necktie | black_thighhighs | white_skirt | black_corset | bottle | official_alternate_costume |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:--------------------|:-----------------|:-------|:--------------|:---------|:-----------------|:----------------|:-------------|:------------------------|:--------------|:-------------|:--------------|:------------|:----------|:-----------------------|:-------------------|:-----------|:--------------|:----------------------|:----------------|:--------|:-----------------|:------------|:--------------|:----------------|:--------------------|:-------------|:------------|:-----------------|:-----------|:-----------------|:------------------|:---------------|:-------------|:--------------------|:------------|:--------------------|:--------|:-----------|:-----------------|:-------------------|:----------------|:-------------------|:--------------|:---------------|:---------|:-----------------------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | | X | X | X | X | X | X | X | X |
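The clusters above can be mined programmatically: intersecting the tag lists of two clusters separates character-invariant tags from outfit-specific ones. A minimal sketch, using abbreviated tag sets taken from the table above:

```python
# Abbreviated tag sets from clusters 0 (school uniform) and 1 (alternate costume).
cluster_0 = {"1girl", "blush", "looking_at_viewer", "solo", "white_shirt",
             "school_uniform", "pleated_skirt", "kneehighs", "loafers"}
cluster_1 = {"1girl", "blush", "looking_at_viewer", "solo", "white_shirt",
             "bare_shoulders", "detached_sleeves", "black_corset",
             "official_alternate_costume"}

shared = cluster_0 & cluster_1    # tags in both clusters: likely character/style tags
outfit_0 = cluster_0 - cluster_1  # tags unique to cluster 0: its outfit
outfit_1 = cluster_1 - cluster_0  # tags unique to cluster 1: its outfit

print(sorted(shared))
print(sorted(outfit_0))
print(sorted(outfit_1))
```

The set differences give a usable tag prompt for each outfit, while the intersection collects the tags that describe the character regardless of costume.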
open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct
---
pretty_name: Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T16:55:01.684484](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct/blob/main/results_2023-12-23T16-55-01.684484.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6653838410064873,\n\
\ \"acc_stderr\": 0.031640270521971985,\n \"acc_norm\": 0.6660954003934071,\n\
\ \"acc_norm_stderr\": 0.03228645429155969,\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n\
\ \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7124078868751245,\n\
\ \"acc_stderr\": 0.004517148434180491,\n \"acc_norm\": 0.8829914359689305,\n\
\ \"acc_norm_stderr\": 0.0032077357692780416\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"\
acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n\
\ \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n\
\ \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n\
\ \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6467020470053071,\n \
\ \"acc_stderr\": 0.013166337192115683\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-55-01.684484.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- '**/details_harness|winogrande|5_2023-12-23T16-55-01.684484.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T16-55-01.684484.parquet'
- config_name: results
data_files:
- split: 2023_12_23T16_55_01.684484
path:
- results_2023-12-23T16-55-01.684484.parquet
- split: latest
path:
- results_2023-12-23T16-55-01.684484.parquet
---
# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct",
"harness_winogrande_5",
    split="latest")
```
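Each MMLU subtask is exposed as its own configuration following a predictable naming scheme (`harness_hendrycksTest_<subject>_5`, as listed in the YAML above), so a small helper can build the config name for any subject. The helper name below is illustrative, not part of the `datasets` library:

```python
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    """Build the configuration name for a Hendrycks MMLU subtask,
    e.g. 'astronomy' -> 'harness_hendrycksTest_astronomy_5'."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"


# The returned string would be passed as the second argument to load_dataset,
# in place of "harness_winogrande_5" in the example above.
config = mmlu_config_name("world_religions")  # "harness_hendrycksTest_world_religions_5"
```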
## Latest results
These are the [latest results from run 2023-12-23T16:55:01.684484](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct/blob/main/results_2023-12-23T16-55-01.684484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6653838410064873,
"acc_stderr": 0.031640270521971985,
"acc_norm": 0.6660954003934071,
"acc_norm_stderr": 0.03228645429155969,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7124078868751245,
"acc_stderr": 0.004517148434180491,
"acc_norm": 0.8829914359689305,
"acc_norm_stderr": 0.0032077357692780416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343338
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
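Once the results JSON above has been loaded into a Python dict, per-subtask scores can be aggregated locally. A minimal sketch, using only a subset of the MMLU entries for brevity (the key prefix `harness|hendrycksTest-` matches the task names shown above):

```python
# Average the 5-shot MMLU accuracies from a results dict shaped like the
# JSON above (only three subtasks are included here as a stand-in).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.44},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7302631578947368},
}

# Keep only the MMLU (hendrycksTest) entries and collect their accuracies.
mmlu_scores = [
    task["acc"]
    for name, task in results.items()
    if name.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The same filtering pattern works for the full results file; the leaderboard's reported MMLU number is the unweighted mean over all 57 subtasks.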
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anjunhu/naively_captioned_nabirds_cub200labelset | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
- name: path
dtype: string
splits:
- name: train
num_bytes: 425085267.75
num_examples: 5674
download_size: 424514060
dataset_size: 425085267.75
---
# Dataset Card for "naively_captioned_nabirds_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexandrainst/lexdk-open | ---
language:
- da
license: cc0-1.0
size_categories:
- 10K<n<100K
pretty_name: Lex.dk Open
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: clarification
dtype: string
- name: authors
sequence: string
- name: date
dtype: string
- name: license
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 18335490
num_examples: 11887
download_size: 10050922
dataset_size: 18335490
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Lex.dk Open
## Dataset Description
- **Point of Contact:** [Dan Saattrup Nielsen](mailto:dan.nielsen@alexandra.dk)
- **Size of downloaded dataset files:** 10.05 MB
- **Size of the generated dataset:** 18.34 MB
- **Total amount of disk used:** 28.39 MB
### Dataset Summary
This dataset consists of articles from the Danish encyclopedia [Lex.dk](https://www.lex.dk).
Only the articles released with a permissive license are included here, which constitutes about 7.5% of all articles.
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
### Data Instances
- **Size of downloaded dataset files:** 10.05 MB
- **Size of the generated dataset:** 18.34 MB
- **Total amount of disk used:** 28.39 MB
An example from the dataset looks as follows.
```
{
'url': 'https://denstoredanske.lex.dk/Kullmanns_M%C3%B8lle',
'title': 'Kullmanns Mølle',
'clarification': '',
'authors': ['https://brugere.lex.dk/6929'],
'date': '2021-01-20T13:23:20+01:00',
'license': 'fri anvendelse',
'text': 'Kullmanns Mølle er en mølle i Gudhjem, opkaldt efter Matts Kullmann, der byggede møllen i 1893 til sin søn, Christian Kullmann, se Gudhjem Mølle.'
}
```
### Data Fields
The data fields are the same among all splits.
- `url`: a `string` feature.
- `title`: a `string` feature.
- `clarification`: a `string` feature.
- `authors`: a `list` of `string` features.
- `date`: a `string` feature.
- `license`: a `string` feature.
- `text`: a `string` feature.
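For a quick offline sanity check, the example record above can be inspected against this field list using only the Python standard library (a minimal sketch; the token count is a rough whitespace-based measure and may differ from the tokenization used for the length histogram):

```python
# Inspect the Lex.dk example record shown above; types follow the field list.
record = {
    "url": "https://denstoredanske.lex.dk/Kullmanns_M%C3%B8lle",
    "title": "Kullmanns Mølle",
    "clarification": "",
    "authors": ["https://brugere.lex.dk/6929"],
    "date": "2021-01-20T13:23:20+01:00",
    "license": "fri anvendelse",
    "text": ("Kullmanns Mølle er en mølle i Gudhjem, opkaldt efter Matts "
             "Kullmann, der byggede møllen i 1893 til sin søn, Christian "
             "Kullmann, se Gudhjem Mølle."),
}

assert isinstance(record["authors"], list)

# Rough article length in whitespace-separated tokens.
n_tokens = len(record["text"].split())
print(n_tokens)  # 24
```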
### Dataset Statistics
There are 11,887 samples in the dataset.
#### Article Length Distribution

## Additional Information
### Dataset Curators
[Dan Saattrup Nielsen](https://saattrupdan.github.io/) from the [The Alexandra
Institute](https://alexandra.dk/) built the dataset and uploaded it to the Hugging Face Hub.
### Licensing Information
The dataset is licensed under the [CC0
license](https://creativecommons.org/share-your-work/public-domain/cc0/). |
iamnguyen/edu_child_v2 | ---
dataset_info:
features:
- name: content
dtype: string
- name: metadata
struct:
- name: metadata
struct:
- name: answer
dtype: string
- name: id
dtype: string
- name: question
dtype: string
- name: school_id
dtype: string
- name: seq_num
dtype: int64
- name: source
dtype: string
- name: tokenized_question
dtype: string
- name: vector
sequence: float64
- name: vector
sequence: float64
splits:
- name: train
num_bytes: 10123120
num_examples: 476
download_size: 6903005
dataset_size: 10123120
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SEACrowd/xcopa | ---
license: unknown
tags:
- question-answering
language:
- ind
---
# xcopa
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
The Cross-lingual Choice of Plausible Alternatives dataset is a benchmark to evaluate the ability of machine learning models to transfer commonsense reasoning across
languages. The dataset is the translation and reannotation of the English COPA (Roemmele et al. 2011) and covers 11 languages from 11 families and several areas around
the globe. The dataset is challenging as it requires both the command of world knowledge and the ability to generalise to new languages. All the details about the
creation of XCOPA and the implementation of the baselines are available in the paper.
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
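XCOPA follows the COPA format, where each instance pairs a premise with two alternatives and asks which is the more plausible cause or effect. A minimal sketch of the resulting evaluation loop (the field names `premise`, `choice1`, `choice2`, `question`, and `label` follow the original COPA schema and are assumptions here, not taken from this card):

```python
# Hedged sketch: COPA-style evaluation loop over dummy records.
records = [
    {"premise": "The man broke his toe.",
     "choice1": "He got a hole in his sock.",
     "choice2": "He dropped a hammer on his foot.",
     "question": "cause", "label": 1},
    {"premise": "I poured water on my sleeping friend.",
     "choice1": "My friend woke up.",
     "choice2": "My friend snored.",
     "question": "effect", "label": 0},
]

def predict(record):
    # Placeholder "model": always picks the first alternative.
    return 0

accuracy = sum(predict(r) == r["label"] for r in records) / len(records)
print(accuracy)  # 0.5
```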
## Citation
```
@inproceedings{ponti2020xcopa,
title={{XCOPA: A} Multilingual Dataset for Causal Commonsense Reasoning},
author={Edoardo M. Ponti, Goran Glava\v{s}, Olga Majewska, Qianchu Liu, Ivan Vuli\'{c} and Anna Korhonen},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
year={2020},
url={https://ducdauge.github.io/files/xcopa.pdf}
}
@inproceedings{roemmele2011choice,
title={Choice of plausible alternatives: An evaluation of commonsense causal reasoning},
author={Roemmele, Melissa and Bejan, Cosmin Adrian and Gordon, Andrew S},
booktitle={2011 AAAI Spring Symposium Series},
year={2011},
url={https://people.ict.usc.edu/~gordon/publications/AAAI-SPRING11A.PDF},
}
```
## License
Unknown
## Homepage
[https://github.com/cambridgeltl/xcopa](https://github.com/cambridgeltl/xcopa)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
albanc/autotrain-data-doodles-30 | ---
task_categories:
- image-classification
license: openrail
---
# AutoTrain Dataset for project: doodles-30
## Dataset Description
This dataset has been automatically processed by AutoTrain for project doodles-30.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<256x256 RGB PIL image>",
"target": 1
},
{
"image": "<256x256 RGB PIL image>",
"target": 3
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['ant', 'bear', 'bee', 'bird', 'cat', 'dog', 'dolphin', 'elephant', 'giraffe', 'horse', 'lion', 'mosquito', 'tiger', 'whale'], id=None)"
}
```
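Since `target` is a `ClassLabel`, the integer in each sample maps back to a class name via the names list above (a minimal sketch, no download needed):

```python
# Decode the integer `target` back to its class name using the
# ClassLabel names listed in the dataset fields above.
names = ['ant', 'bear', 'bee', 'bird', 'cat', 'dog', 'dolphin', 'elephant',
         'giraffe', 'horse', 'lion', 'mosquito', 'tiger', 'whale']

sample = {"target": 1}  # first sample from the instances above
print(names[sample["target"]])  # bear
```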
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 336 |
| valid | 84 | |
yuyijiong/Chinese_Paper_QA | ---
license: cc-by-nc-4.0
language:
- zh
size_categories:
- 1K<n<10K
---
# Chinese Paper QA Dataset
* The paper data comes from CNKI (知网) and is copyright-restricted, so it cannot be released publicly. Please do not upload it anywhere public after downloading.
* It covers two tasks: writing abstracts for papers, and question answering over paper content. The abstract task has already been moved to the [Chinese Paper Abstract dataset](https://huggingface.co/datasets/yuyijiong/Chinese_Paper_Abstract/settings).
## Improved version
* Longer papers were selected from this dataset, and multiple tasks were designed for each paper, forming a new dataset: [Chinese Paper Multi-task QA dataset](https://huggingface.co/datasets/yuyijiong/Paper_mutli_QA_Chinese) |
CyberHarem/ichihara_nina_theidolmastercinderellagirlsu149 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ichihara Nina
This is the dataset of Ichihara Nina, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 416 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 416 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 416 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 416 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
AdapterOcean/med_alpaca_standardized_cluster_41 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 72505614
num_examples: 7396
download_size: 21360035
dataset_size: 72505614
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_41"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qkrwnstj/anime-captioning-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7831425.0
num_examples: 20
download_size: 7833024
dataset_size: 7831425.0
---
# Dataset Card for "mid-journey-captioning-dataset-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clarin-pl/poquad | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- pl
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: PoQuaD
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
---
PoQuaD dataset |
GATE-engine/fc100 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1119561906.0
num_examples: 36000
- name: validation
num_bytes: 402692462.0
num_examples: 12000
- name: test
num_bytes: 395837378.0
num_examples: 12000
download_size: 1917746447
dataset_size: 1918091746.0
---
# Dataset Card for "fc100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/French_Children_Spontaneous_Speech_Data | ---
task_categories:
- automatic-speech-recognition
language:
- fr
---
# Dataset Card for Nexdata/French_Children_Spontaneous_Speech_Data
## Description
The 162 Hours - French Children's Spontaneous Speech Data is a collection of speech clips covering multiple topics. All the speech audio was manually transcribed into text; speaker identity, gender, and other attributes are also annotated. This dataset can be used for voiceprint recognition model training, corpus construction for machine translation, and algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1307?source=Huggingface
# Specifications
## Format
mp4 for video and wav for audio;
## Age
children aged 12 and under;
## Content category
including interviews, self-media, variety shows, etc.;
## Language
French;
## Annotation
annotation for the transcription text, speaker identification, gender;
## Application scenarios
speech recognition, video caption generation and video content review;
## Accuracy
at a Word Accuracy Rate (WAR) of no less than 98%.
# Licensing Information
Commercial License |
AnnikaSimonsen/combined_train_dataset_fo-en | ---
dataset_info:
features:
- name: File name
dtype: string
- name: Faroese
dtype: string
- name: English translation
dtype: string
splits:
- name: train
num_bytes: 11318248
num_examples: 105634
download_size: 7455201
dataset_size: 11318248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MatsuoDochiai/Manokk | ---
license: openrail
---
|
eunsxx/pokemon_image | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Abra
'1': Aerodactyl
'2': Alakazam
'3': Arbok
'4': Arcanine
'5': Articuno
'6': Beedrill
'7': Bellsprout
'8': Blastoise
'9': Bulbasaur
'10': Butterfree
'11': Caterpie
'12': Chansey
'13': Charizard
'14': Charmander
'15': Charmeleon
'16': Clefable
'17': Clefairy
'18': Cloyster
'19': Cubone
'20': Dewgong
'21': Diglett
'22': Ditto
'23': Dodrio
'24': Doduo
'25': Dragonair
'26': Dragonite
'27': Dratini
'28': Drowzee
'29': Dugtrio
'30': Eevee
'31': Ekans
'32': Electabuzz
'33': Electrode
'34': Exeggcute
'35': Exeggutor
'36': Farfetchd
'37': Fearow
'38': Flareon
'39': Gastly
'40': Gengar
'41': Geodude
'42': Gloom
'43': Golbat
'44': Goldeen
'45': Golduck
'46': Golem
'47': Graveler
'48': Grimer
'49': Growlithe
'50': Gyarados
'51': Haunter
'52': Hitmonchan
'53': Hitmonlee
'54': Horsea
'55': Hypno
'56': Ivysaur
'57': Jigglypuff
'58': Jolteon
'59': Jynx
'60': Kabuto
'61': Kabutops
'62': Kadabra
'63': Kakuna
'64': Kangaskhan
'65': Kingler
'66': Koffing
'67': Krabby
'68': Lapras
'69': Lickitung
'70': Machamp
'71': Machoke
'72': Machop
'73': Magikarp
'74': Magmar
'75': Magnemite
'76': Magneton
'77': Mankey
'78': Marowak
'79': Meowth
'80': Metapod
'81': Mew
'82': Mewtwo
'83': Moltres
'84': MrMime
'85': Muk
'86': Nidoking
'87': Nidoqueen
'88': Nidorina
'89': Nidorino
'90': Ninetales
'91': Oddish
'92': Omanyte
'93': Omastar
'94': Onix
'95': Paras
'96': Parasect
'97': Persian
'98': Pidgeot
'99': Pidgeotto
'100': Pidgey
'101': Pikachu
'102': Pinsir
'103': Poliwag
'104': Poliwhirl
'105': Poliwrath
'106': Ponyta
'107': Porygon
'108': Primeape
'109': Psyduck
'110': Raichu
'111': Rapidash
'112': Raticate
'113': Rattata
'114': Rhydon
'115': Rhyhorn
'116': Sandshrew
'117': Sandslash
'118': Scyther
'119': Seadra
'120': Seaking
'121': Seel
'122': Shellder
'123': Slowbro
'124': Slowpoke
'125': Snorlax
'126': Spearow
'127': Squirtle
'128': Starmie
'129': Staryu
'130': Tangela
'131': Tauros
'132': Tentacool
'133': Tentacruel
'134': Vaporeon
'135': Venomoth
'136': Venonat
'137': Venusaur
'138': Victreebel
'139': Vileplume
'140': Voltorb
'141': Vulpix
'142': Wartortle
'143': Weedle
'144': Weepinbell
'145': Weezing
'146': Wigglytuff
'147': Zapdos
'148': Zubat
splits:
- name: train
num_bytes: 1104571916.4706388
num_examples: 9060
- name: test
num_bytes: 190556566.9813611
num_examples: 1599
download_size: 1170821962
dataset_size: 1295128483.452
---
# Dataset Card for "pokemon_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huanggab/reddit_haiku | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: English haiku dataset scraped from Reddit's /r/haiku with topics extracted
using KeyBERT
size_categories:
- 10K<n<100K
source_datasets:
- extended|other
tags:
- haiku
- poem
- poetry
- reddit
- keybert
- generation
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for "Reddit Haiku"
This dataset contains haikus from the subreddit [/r/haiku](https://www.reddit.com/r/haiku/) scraped and filtered between October 19th and 20th, 2022, combined with a [previous dump](https://zissou.infosci.cornell.edu/convokit/datasets/subreddit-corpus/corpus-zipped/hackintosh_ja~-~hamsters/) of that same subreddit packaged by [ConvoKit](https://convokit.cornell.edu/documentation/subreddit.html) as part of the Subreddit Corpus, which is itself a subset of [pushshift.io](https://pushshift.io/)'s big dump.
A main motivation for this dataset was to collect an alternative haiku dataset for evaluation, in particular for evaluating Fabian Mueller's Deep Haiku [model](https://huggingface.co/fabianmmueller/deep-haiku-gpt-j-6b-8bit), which was trained on the Haiku datasets of [hjhalani30](https://www.kaggle.com/datasets/hjhalani30/haiku-dataset) and [bfbarry](https://www.kaggle.com/datasets/bfbarry/haiku-dataset), which are also available on [huggingface hub](https://huggingface.co/datasets/statworx/haiku).
## Fields
The fields are post id (`id`), the content of the haiku (`processed_title`), upvotes (`ups`), and topic keywords (`keywords`). Topic keywords for each haiku have been extracted with the [KeyBERT library](https://maartengr.github.io/KeyBERT/guides/quickstart.html) and truncated to top-5 keywords.
## Usage
This dataset is intended for evaluation, hence there is only one split which is `test`.
```python
from datasets import load_dataset
d = load_dataset('huanggab/reddit_haiku', data_files={'test': 'merged_with_keywords.csv'}) # use data_files or it will result in error
>>> print(d['test'][0])
#{'Unnamed: 0': 0, 'id': '1020ac', 'processed_title': "There's nothing inside/There is nothing outside me/I search on in hope.", 'ups': 5, 'keywords': "[('inside', 0.5268), ('outside', 0.3751), ('search', 0.3367), ('hope', 0.272)]"}
```
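The `keywords` field is stored as a stringified Python list of `(keyword, score)` tuples, as in the record above; it can be parsed back with the standard library alone (a minimal sketch):

```python
import ast

# Parse the stringified `keywords` field from the example record above
# back into a list of (keyword, score) tuples.
raw = "[('inside', 0.5268), ('outside', 0.3751), ('search', 0.3367), ('hope', 0.272)]"

keywords = ast.literal_eval(raw)           # list of (str, float) tuples
top = max(keywords, key=lambda kw: kw[1])  # highest-scoring keyword
print(top[0])  # inside
```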
There is code for scraping and processing in `processing_code`, and a subset of the data with more fields such as author Karma, downvotes and posting time at `processing_code/reddit-2022-10-20-dump.csv`. |
freshpearYoon/v3_train_free_concat_34 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842555808
num_examples: 2500
download_size: 1767071989
dataset_size: 3842555808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FINNUMBER/FINCH_TRAIN_QA_NEWFORMAT | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 42367676
num_examples: 10082
download_size: 20535795
dataset_size: 42367676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tasksource/wiki-hades | ---
license: apache-2.0
---
|
zhan1993/transfer_matrix_loss | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: expert_name
dtype: string
- name: task_eval_on
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 318698
num_examples: 4815
download_size: 87639
dataset_size: 318698
---
# Dataset Card for "transfer_matrix_loss"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Codec-SUPERB/beehive_states_extract_unit | ---
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 552998880
num_examples: 576
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 552998880
num_examples: 576
- name: academicodec_hifi_24k_320d
num_bytes: 829478880
num_examples: 576
- name: audiodec_24k_320d
num_bytes: 1769520096
num_examples: 576
- name: dac_16k
num_bytes: 3376652256
num_examples: 576
- name: dac_24k
num_bytes: 9387299808
num_examples: 576
- name: dac_44k
num_bytes: 2771043552
num_examples: 576
- name: encodec_24k
num_bytes: 414754272
num_examples: 576
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 4423930848
num_examples: 576
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 4423930848
num_examples: 576
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 4423783392
num_examples: 576
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 2211943392
num_examples: 576
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 4423783392
num_examples: 576
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 4423783392
num_examples: 576
- name: speech_tokenizer_16k
num_bytes: 1105968096
num_examples: 576
download_size: 7005619447
dataset_size: 45091869984
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
danjacobellis/audio_har_descript_44kHz_frames_1240_95p | ---
dataset_info:
features:
- name: codes
dtype:
array2_d:
shape:
- 9
- 1240
dtype: float32
- name: label
dtype:
class_label:
names:
'0': No Activity
'1': Writing
'2': Drawing
'3': Cutting paper
'4': Typing on keyboard
'5': Typing on phone
'6': Browsing on phone
'7': Clapping
'8': Shuffling cards
'9': Scratching
'10': Wiping table
'11': Brushing hair
'12': Washing hands
'13': Drinking
'14': Eating snacks
'15': Brushing teeth
'16': Chopping
'17': Grating
'18': Frying
'19': Sweeping
'20': Vacuuming
'21': Washing dishes
'22': Filling water
'23': Using microwave
- name: label_str
dtype: string
- name: participant
dtype: int32
splits:
- name: train
num_bytes: 121470010
num_examples: 2717
download_size: 35804634
dataset_size: 121470010
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marsggbo/alpaca10k_yizhongw10k_MixtralMoE_patterns | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt_len
dtype: int64
- name: token_idx
sequence: int64
- name: token_expert_patterns
sequence:
sequence:
sequence: int64
- name: sentence_expert_pattern
sequence:
sequence: int64
splits:
- name: train
num_bytes: 12557887444
num_examples: 20000
download_size: 201597946
dataset_size: 12557887444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seungheondoh/LP-MusicCaps-MC | ---
license: mit
language:
- en
tags:
- music
- text-to-music
- music-to-text
- art
pretty_name: LP-MusicCaps-MC
size_categories:
- 1K<n<10K
---
======================================
**!important**: Be careful when using `caption_attribute_prediction` (we do not recommend using it)!
======================================
# Dataset Card for LP-MusicCaps-MC
## Dataset Description
- **Repository:** [LP-MusicCaps repository](https://github.com/seungheondoh/lp-music-caps)
- **Paper:** [ArXiv](https://arxiv.org/abs/2307.16372)
## Dataset Summary
**LP-MusicCaps** is a Large Language Model based Pseudo Music Caption dataset for `text-to-music` and `music-to-text` tasks. We construct the music-to-caption pairs with tag-to-caption generation (using three existing multi-label tag datasets and four task instructions). The data sources are MusicCaps, Magnatagtune, and Million Song Dataset ECALS subset.
- [LP-MusicCaps MSD](https://huggingface.co/datasets/seungheondoh/LP-MusicCaps-MSD): 0.5M Audio with 2.2M Caption
- [LP-MusicCaps MTT](https://huggingface.co/datasets/seungheondoh/LP-MusicCaps-MTT): 22k Audio with 88k Caption
- **LP-MusicCaps MC (This Repo)**: 5521 Audio with 22084 Caption. We utilize the 13,219 unique aspects used by 10 musicians in the [MusicCaps dataset](https://huggingface.co/datasets/google/MusicCaps) to perform tag-to-caption generation through an LLM.
## Data Instances
Each instance in LP-MusicCaps MC (This Repo) represents a music-text pair with multiple captions and meta-attributes:
```
{
'fname': '[-0Gj8-vB1q4]-[30-40]',
'ytid': '-0Gj8-vB1q4',
'aspect_list': ['low quality',
'sustained strings melody',
'soft female vocal',
'mellow piano melody',
'sad',
'soulful',
'ballad'
],
'caption_ground_truth': 'The low quality recording features a ballad song that contains sustained strings, mellow piano melody and soft female vocal singing over it. It sounds sad and soulful, like something you would hear at Sunday services.',
'caption_writing': 'This heartfelt ballad showcases a soulful and sad low-quality sustained strings melody intertwined with a mellow piano melody, and a soft female vocal, resulting in an emotionally charged and sonically rich experience for listeners.',
'caption_summary': 'A melancholic and soulful ballad with low-quality sustained strings, a mellow piano melody, and soft female vocals.',
'caption_paraphrase': 'A melancholic ballad of soulful sadness featuring a low quality sustained strings melody complemented by a soft, mellow piano melody accompanied by a plaintive, soothing female vocal.',
'caption_attribute_prediction': 'This soulful ballad features a sustained strings melody that tugs at your heartstrings, accompanied by a mellow piano melody and gentle percussion. The soft, emotionally-charged female vocal delivers poetic and poignant lyrics that speak to the sadness and pain of lost love. The addition of a beautiful string arrangement adds to the melodic depth of the song, making it a truly moving listening experience. With its slow tempo, this track exudes a mellow and introspective vibe, perfect for those moments when you need a moment to sit and reflect on the past.',
'pseudo_attribute': ['emotional lyrics',
'slow tempo',
'gentle percussion',
'string arrangement'
],
'is_crawled': True,
'author_id': 4,
'start_s': 30,
'end_s': 40,
'audioset_positive_labels': '/m/0140xf,/m/02cjck,/m/04rlf',
'is_balanced_subset': False,
'is_audioset_eval': True
}
```
## Pseudo Caption Example:
Input Tags:
*"video game theme, no singer, instrumental, analog sounding, small keyboard, beatboxing, playful, cheerful, groovy"*
Output Pseudo Captions
*"instrumental track has a joyful and playful vibe, perfect for a video game theme. With no singer, the analog-sounding music features a small keyboard and beatboxing, creating a groovy and cheerful atmosphere"*
[More Information for pseudo caption generation](https://github.com/seungheondoh/lp-music-caps/blob/main/lpmc/llm_captioning/generate.py)
## Data Fields
| Name | Type | Description |
|------------------------------|-----------------|---------------------------------------------------------------------|
| fname | string | File name of the data |
| ytid | string | YouTube ID of the data |
| aspect_list | list of strings | List of unique aspects used by musicians in the MusicCaps dataset |
| caption_ground_truth | string | Ground truth caption for the data |
| caption_writing | string | Pseudo Caption generated through a writing instruction |
| caption_summary | string | Pseudo Caption generated through a summary instruction |
| caption_paraphrase | string | Pseudo Caption generated through a paraphrase instruction |
| caption_attribute_prediction | string | Pseudo Caption generated through a attribute_prediction instruction |
| pseudo_attribute | list of strings | List of pseudo-attributes using in caption_attribute_prediction |
| is_crawled | boolean | Indicates whether the data is crawled or not |
| author_id | int64 | ID of the author |
| start_s | int64 | Start time in seconds |
| end_s | int64 | End time in seconds |
| audioset_positive_labels | string | Positive labels from the AudioSet dataset |
| is_balanced_subset | boolean | Indicates whether the data is part of a balanced subset |
| is_audioset_eval | boolean | Indicates whether the data is for AudioSet evaluation |
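Given the caveat at the top of this card about `caption_attribute_prediction`, a minimal sketch for collecting only the recommended caption fields from a record (standard library only; the record below is abridged from the example above):

```python
# Collect only the recommended pseudo-caption fields, skipping
# caption_attribute_prediction per the caveat at the top of this card.
RECOMMENDED = ("caption_writing", "caption_summary", "caption_paraphrase")

record = {
    "fname": "[-0Gj8-vB1q4]-[30-40]",
    "caption_writing": "This heartfelt ballad showcases ...",
    "caption_summary": "A melancholic and soulful ballad ...",
    "caption_paraphrase": "A melancholic ballad of soulful sadness ...",
    "caption_attribute_prediction": "This soulful ballad features ...",
}

captions = [record[k] for k in RECOMMENDED]
print(len(captions))  # 3
```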
## Considerations for Using the Data
The LP-MusicCaps dataset is recommended for research purposes. Due to a mislabeling issue, we recommend not using `caption_attribute_prediction` and `pseudo_attribute` unless it is specifically for large-scale pretraining. Additionally, the `is_crawled` field indicates the samples used in the reference paper linked above.
## Discussion of Biases
It will be described in a paper to be released soon.
## Other Known Limitations
It will be described in a paper to be released soon. |
keeper-tax/sample-set | ---
dataset_info:
features:
- name: y_true
dtype: string
- name: y_pred1
dtype: string
- name: y_pred2
dtype: string
- name: y_pred3
dtype: string
splits:
- name: train
num_bytes: 3420
num_examples: 100
download_size: 2231
dataset_size: 3420
---
# Dataset Card for "sample-set"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mr-tydi_en_test | ---
pretty_name: '`mr-tydi/en/test`'
viewer: false
source_datasets: ['irds/mr-tydi_en']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/en/test`
The `mr-tydi/en/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/en/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=744
- `qrels`: (relevance assessments); count=935
- For `docs`, use [`irds/mr-tydi_en`](https://huggingface.co/datasets/irds/mr-tydi_en)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_en_test', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_en_test', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
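For downstream evaluation, the qrels records are typically folded into a `{query_id: {doc_id: relevance}}` mapping. A minimal sketch with dummy records in the schema shown above (not real `mr-tydi` data):

```python
# Fold qrels records into the {query_id: {doc_id: relevance}} shape
# expected by common IR evaluation tools. Dummy records only.
records = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]

qrels = {}
for rec in records:
    qrels.setdefault(rec["query_id"], {})[rec["doc_id"]] = rec["relevance"]

print(len(qrels))  # 2 queries
```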
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
Oshan/Uniref90_large_temp | ---
dataset_info:
features:
- name: cluster_id
dtype: string
- name: cluster_size
dtype: int64
- name: taxon_id
dtype: int64
- name: aa_len
dtype: int64
- name: aa_seq
dtype: string
splits:
- name: train
num_bytes: 15035559
num_examples: 500
download_size: 0
dataset_size: 15035559
---
# Dataset Card for "Uniref90_large_temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-97000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 651490
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
venetis/consumer_complaint_kaggle | ---
license: afl-3.0
---
Dataset originates from here:
https://www.kaggle.com/datasets/kaggle/us-consumer-finance-complaints |
Nabarajsub/nepali_image_captioning | ---
license: mit
---
|
CyberHarem/eremiya_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of eremiya (Fire Emblem)
This is the dataset of eremiya (Fire Emblem), containing 11 images and their tags.
The core tags of this character are `hat, purple_hair, breasts, long_hair, bangs, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 14.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eremiya_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eremiya_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 17.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eremiya_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eremiya_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 23.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/eremiya_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/eremiya_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, solo, long_sleeves, smile, looking_at_viewer, open_mouth, purple_dress, simple_background, holding_staff, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | long_sleeves | smile | looking_at_viewer | open_mouth | purple_dress | simple_background | holding_staff | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------|:--------------------|:-------------|:---------------|:--------------------|:----------------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
Vinisf/NickZ | ---
license: openrail
---
|
medric49/dolly-rag-gpt2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
- name: res:airedefined/gpt2-dolly-rag
dtype: string
splits:
- name: train
num_bytes: 6030355
num_examples: 3608
download_size: 3779427
dataset_size: 6030355
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dolly-rag-gpt2-dolly-rag"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-samsum-ede55545-13415852 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: google/bigbird-pegasus-large-arxiv
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-arxiv
* Dataset: samsum
* Config: samsum
* Split: test
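The `col_mapping` above tells the evaluator which dataset columns feed the summarization task (`dialogue` becomes the input `text`, `summary` becomes the `target`). As an illustrative sketch (not the evaluator's actual internals), the mapping amounts to a plain key rename over each example:

```python
# Illustrative sketch: apply the card's col_mapping as a plain key rename.
col_mapping = {"dialogue": "text", "summary": "target"}

def remap(example: dict) -> dict:
    """Rename dataset columns to the field names the summarization task expects."""
    return {col_mapping.get(key, key): value for key, value in example.items()}
```

Columns not listed in the mapping pass through unchanged.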
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
yjernite/prof_report__dalle-2__sd_21__12 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: paralegal
num_bytes: 3528
num_examples: 7
- name: bartender
num_bytes: 3504
num_examples: 6
- name: facilities_manager
num_bytes: 3528
num_examples: 7
- name: accountant
num_bytes: 3456
num_examples: 4
- name: graphic_designer
num_bytes: 3528
num_examples: 7
- name: network_administrator
num_bytes: 3552
num_examples: 8
- name: financial_manager
num_bytes: 3504
num_examples: 6
- name: baker
num_bytes: 3576
num_examples: 9
- name: security_guard
num_bytes: 3528
num_examples: 7
- name: artist
num_bytes: 3576
num_examples: 9
- name: author
num_bytes: 3528
num_examples: 7
- name: printing_press_operator
num_bytes: 3480
num_examples: 5
- name: public_relations_specialist
num_bytes: 3528
num_examples: 7
- name: sheet_metal_worker
num_bytes: 3456
num_examples: 4
- name: clergy
num_bytes: 3528
num_examples: 7
- name: payroll_clerk
num_bytes: 3552
num_examples: 8
- name: teller
num_bytes: 3552
num_examples: 8
- name: real_estate_broker
num_bytes: 3504
num_examples: 6
- name: customer_service_representative
num_bytes: 3528
num_examples: 7
- name: painter
num_bytes: 3552
num_examples: 8
- name: tractor_operator
num_bytes: 3504
num_examples: 6
- name: dental_hygienist
num_bytes: 3504
num_examples: 6
- name: industrial_engineer
num_bytes: 3480
num_examples: 5
- name: electrician
num_bytes: 3504
num_examples: 6
- name: head_cook
num_bytes: 3528
num_examples: 7
- name: health_technician
num_bytes: 3480
num_examples: 5
- name: carpet_installer
num_bytes: 3480
num_examples: 5
- name: purchasing_agent
num_bytes: 3528
num_examples: 7
- name: supervisor
num_bytes: 3528
num_examples: 7
- name: civil_engineer
num_bytes: 3504
num_examples: 6
- name: lawyer
num_bytes: 3552
num_examples: 8
- name: language_pathologist
num_bytes: 3480
num_examples: 5
- name: ceo
num_bytes: 3456
num_examples: 4
- name: computer_support_specialist
num_bytes: 3480
num_examples: 5
- name: postal_worker
num_bytes: 3528
num_examples: 7
- name: mechanical_engineer
num_bytes: 3480
num_examples: 5
- name: nursing_assistant
num_bytes: 3504
num_examples: 6
- name: dentist
num_bytes: 3528
num_examples: 7
- name: tutor
num_bytes: 3504
num_examples: 6
- name: butcher
num_bytes: 3528
num_examples: 7
- name: insurance_agent
num_bytes: 3528
num_examples: 7
- name: courier
num_bytes: 3504
num_examples: 6
- name: computer_programmer
num_bytes: 3504
num_examples: 6
- name: truck_driver
num_bytes: 3480
num_examples: 5
- name: mechanic
num_bytes: 3504
num_examples: 6
- name: marketing_manager
num_bytes: 3480
num_examples: 5
- name: sales_manager
num_bytes: 3480
num_examples: 5
- name: correctional_officer
num_bytes: 3504
num_examples: 6
- name: manager
num_bytes: 3456
num_examples: 4
- name: underwriter
num_bytes: 3504
num_examples: 6
- name: executive_assistant
num_bytes: 3480
num_examples: 5
- name: designer
num_bytes: 3528
num_examples: 7
- name: groundskeeper
num_bytes: 3576
num_examples: 9
- name: mental_health_counselor
num_bytes: 3528
num_examples: 7
- name: aerospace_engineer
num_bytes: 3480
num_examples: 5
- name: taxi_driver
num_bytes: 3504
num_examples: 6
- name: nurse
num_bytes: 3480
num_examples: 5
- name: data_entry_keyer
num_bytes: 3552
num_examples: 8
- name: musician
num_bytes: 3552
num_examples: 8
- name: event_planner
num_bytes: 3552
num_examples: 8
- name: writer
num_bytes: 3504
num_examples: 6
- name: cook
num_bytes: 3600
num_examples: 10
- name: welder
num_bytes: 3504
num_examples: 6
- name: producer
num_bytes: 3528
num_examples: 7
- name: hairdresser
num_bytes: 3480
num_examples: 5
- name: farmer
num_bytes: 3504
num_examples: 6
- name: construction_worker
num_bytes: 3552
num_examples: 8
- name: air_conditioning_installer
num_bytes: 3480
num_examples: 5
- name: electrical_engineer
num_bytes: 3480
num_examples: 5
- name: occupational_therapist
num_bytes: 3504
num_examples: 6
- name: career_counselor
num_bytes: 3480
num_examples: 5
- name: interior_designer
num_bytes: 3552
num_examples: 8
- name: jailer
num_bytes: 3480
num_examples: 5
- name: office_clerk
num_bytes: 3480
num_examples: 5
- name: market_research_analyst
num_bytes: 3504
num_examples: 6
- name: laboratory_technician
num_bytes: 3504
num_examples: 6
- name: social_assistant
num_bytes: 3552
num_examples: 8
- name: medical_records_specialist
num_bytes: 3504
num_examples: 6
- name: machinery_mechanic
num_bytes: 3480
num_examples: 5
- name: police_officer
num_bytes: 3504
num_examples: 6
- name: software_developer
num_bytes: 3504
num_examples: 6
- name: clerk
num_bytes: 3504
num_examples: 6
- name: salesperson
num_bytes: 3552
num_examples: 8
- name: social_worker
num_bytes: 3552
num_examples: 8
- name: director
num_bytes: 3480
num_examples: 5
- name: fast_food_worker
num_bytes: 3576
num_examples: 9
- name: singer
num_bytes: 3576
num_examples: 9
- name: metal_worker
num_bytes: 3504
num_examples: 6
- name: cleaner
num_bytes: 3552
num_examples: 8
- name: computer_systems_analyst
num_bytes: 3528
num_examples: 7
- name: dental_assistant
num_bytes: 3480
num_examples: 5
- name: psychologist
num_bytes: 3480
num_examples: 5
- name: machinist
num_bytes: 3480
num_examples: 5
- name: therapist
num_bytes: 3480
num_examples: 5
- name: veterinarian
num_bytes: 3504
num_examples: 6
- name: teacher
num_bytes: 3504
num_examples: 6
- name: architect
num_bytes: 3480
num_examples: 5
- name: office_worker
num_bytes: 3504
num_examples: 6
- name: drywall_installer
num_bytes: 3504
num_examples: 6
- name: nutritionist
num_bytes: 3504
num_examples: 6
- name: librarian
num_bytes: 3480
num_examples: 5
- name: childcare_worker
num_bytes: 3480
num_examples: 5
- name: school_bus_driver
num_bytes: 3480
num_examples: 5
- name: file_clerk
num_bytes: 3504
num_examples: 6
- name: logistician
num_bytes: 3504
num_examples: 6
- name: scientist
num_bytes: 3480
num_examples: 5
- name: teaching_assistant
num_bytes: 3480
num_examples: 5
- name: radiologic_technician
num_bytes: 3480
num_examples: 5
- name: manicurist
num_bytes: 3552
num_examples: 8
- name: community_manager
num_bytes: 3528
num_examples: 7
- name: carpenter
num_bytes: 3504
num_examples: 6
- name: claims_appraiser
num_bytes: 3528
num_examples: 7
- name: dispatcher
num_bytes: 3528
num_examples: 7
- name: cashier
num_bytes: 3528
num_examples: 7
- name: roofer
num_bytes: 3528
num_examples: 7
- name: photographer
num_bytes: 3504
num_examples: 6
- name: detective
num_bytes: 3504
num_examples: 6
- name: financial_advisor
num_bytes: 3480
num_examples: 5
- name: wholesale_buyer
num_bytes: 3528
num_examples: 7
- name: it_specialist
num_bytes: 3480
num_examples: 5
- name: pharmacy_technician
num_bytes: 3504
num_examples: 6
- name: engineer
num_bytes: 3456
num_examples: 4
- name: mover
num_bytes: 3552
num_examples: 8
- name: plane_mechanic
num_bytes: 3456
num_examples: 4
- name: interviewer
num_bytes: 3528
num_examples: 7
- name: massage_therapist
num_bytes: 3528
num_examples: 7
- name: dishwasher
num_bytes: 3552
num_examples: 8
- name: fitness_instructor
num_bytes: 3528
num_examples: 7
- name: credit_counselor
num_bytes: 3504
num_examples: 6
- name: stocker
num_bytes: 3576
num_examples: 9
- name: pharmacist
num_bytes: 3456
num_examples: 4
- name: doctor
num_bytes: 3480
num_examples: 5
- name: compliance_officer
num_bytes: 3528
num_examples: 7
- name: aide
num_bytes: 3504
num_examples: 6
- name: bus_driver
num_bytes: 3528
num_examples: 7
- name: financial_analyst
num_bytes: 3480
num_examples: 5
- name: receptionist
num_bytes: 3504
num_examples: 6
- name: janitor
num_bytes: 3528
num_examples: 7
- name: plumber
num_bytes: 3480
num_examples: 5
- name: physical_therapist
num_bytes: 3504
num_examples: 6
- name: inventory_clerk
num_bytes: 3552
num_examples: 8
- name: firefighter
num_bytes: 3528
num_examples: 7
- name: coach
num_bytes: 3504
num_examples: 6
- name: maid
num_bytes: 3480
num_examples: 5
- name: pilot
num_bytes: 3480
num_examples: 5
- name: repair_worker
num_bytes: 3480
num_examples: 5
download_size: 864405
dataset_size: 512448
---
# Dataset Card for "prof_report__dalle-2__sd_21__12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__Galactica-6.7B-EssayWriter | ---
pretty_name: Evaluation run of KnutJaegersberg/Galactica-6.7B-EssayWriter
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/Galactica-6.7B-EssayWriter](https://huggingface.co/KnutJaegersberg/Galactica-6.7B-EssayWriter)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Galactica-6.7B-EssayWriter\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T16:42:22.412540](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Galactica-6.7B-EssayWriter/blob/main/results_2023-12-03T16-42-22.412540.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.034874905231235785,\n\
\ \"acc_stderr\": 0.005053480765022248\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.034874905231235785,\n \"acc_stderr\": 0.005053480765022248\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/Galactica-6.7B-EssayWriter
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_03T16_42_22.412540
path:
- '**/details_harness|gsm8k|5_2023-12-03T16-42-22.412540.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T16-42-22.412540.parquet'
- config_name: results
data_files:
- split: 2023_12_03T16_42_22.412540
path:
- results_2023-12-03T16-42-22.412540.parquet
- split: latest
path:
- results_2023-12-03T16-42-22.412540.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/Galactica-6.7B-EssayWriter
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/Galactica-6.7B-EssayWriter
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Galactica-6.7B-EssayWriter](https://huggingface.co/KnutJaegersberg/Galactica-6.7B-EssayWriter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Galactica-6.7B-EssayWriter",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T16:42:22.412540](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Galactica-6.7B-EssayWriter/blob/main/results_2023-12-03T16-42-22.412540.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.034874905231235785,
"acc_stderr": 0.005053480765022248
},
"harness|gsm8k|5": {
"acc": 0.034874905231235785,
"acc_stderr": 0.005053480765022248
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yardeny/tokenized_t5_small_context_len_64 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 10163799114
num_examples: 80462898
download_size: 3657002292
dataset_size: 10163799114
---
# Dataset Card for "tokenized_t5_small_context_len_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nerfgun3/john_kafka | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# John Kafka Artist Embedding / Textual Inversion
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt: ```"drawn by john_kafka"```
If it is too strong, just add [] around it.
Trained until 6000 steps
Have fun :)
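The install step above (dropping the file into the webui's `embeddings` folder) can be sketched as a small helper. This is illustrative only: the function name is made up, and the folder layout assumed is the standard AUTOMATIC1111 webui one.

```python
import shutil
from pathlib import Path

def install_embedding(embedding_file: str, webui_root: str) -> Path:
    """Copy a downloaded embedding file into <webui_root>/embeddings.

    Illustrative helper only -- the webui picks the file up on the next
    restart (or via the UI's refresh button).
    """
    dest_dir = Path(webui_root) / "embeddings"
    dest_dir.mkdir(parents=True, exist_ok=True)  # create the folder if missing
    dest = dest_dir / Path(embedding_file).name
    shutil.copy(embedding_file, dest)
    return dest
```

After copying, reference it in a prompt with `"drawn by john_kafka"` as described above.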
## Example Pictures
<table>
<tr>
<td><img src=https://i.imgur.com/aCnC1zv.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/FdBuWbG.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/1rkuXkZ.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/5N9Wp7q.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/v2AkXjU.png width=100% height=100%/></td>
</tr>
</table>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce nor share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
CyberHarem/dolla_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dolla/ドラー/朵拉 (Nikke: Goddess of Victory)
This is the dataset of dolla/ドラー/朵拉 (Nikke: Goddess of Victory), containing 41 images and their tags.
The core tags of this character are `long_hair, purple_eyes, ponytail, breasts, bangs, earrings, purple_hair, large_breasts, ahoge, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 60.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dolla_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 30.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dolla_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 97 | 63.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dolla_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 51.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dolla_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 97 | 98.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dolla_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/dolla_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, white_shirt, black_gloves, long_sleeves, black_necktie, black_jacket, jewelry, looking_at_viewer, black_pants, collared_shirt, formal, holding, open_jacket, suit, blush, closed_mouth, black_choker, navel |
| 1 | 13 |  |  |  |  |  | 1girl, bare_shoulders, solo, black_gloves, looking_at_viewer, black_dress, thighs, halterneck, half_gloves, bracelet, cleavage, closed_mouth, hair_ornament, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_shirt | black_gloves | long_sleeves | black_necktie | black_jacket | jewelry | looking_at_viewer | black_pants | collared_shirt | formal | holding | open_jacket | suit | blush | closed_mouth | black_choker | navel | bare_shoulders | black_dress | thighs | halterneck | half_gloves | bracelet | cleavage | hair_ornament | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:---------------|:---------------|:----------------|:---------------|:----------|:--------------------|:--------------|:-----------------|:---------|:----------|:--------------|:-------|:--------|:---------------|:---------------|:--------|:-----------------|:--------------|:---------|:-------------|:--------------|:-----------|:-----------|:----------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | | X | | | | | X | | | | | | | | X | | | X | X | X | X | X | X | X | X | X |
|
gvlk/celebqav3 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1945935
num_examples: 870
download_size: 308641
dataset_size: 1945935
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_perlthoughts__Chupacabra-7B | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-7B](https://huggingface.co/perlthoughts/Chupacabra-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T15:20:58.431709](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B/blob/main/results_2023-12-03T15-20-58.431709.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.621683093252464,\n\
\ \"acc_stderr\": 0.013358407831777112\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.621683093252464,\n \"acc_stderr\": 0.013358407831777112\n\
\ }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Chupacabra-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_03T15_20_58.431709
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-20-58.431709.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T15-20-58.431709.parquet'
- config_name: results
data_files:
- split: 2023_12_03T15_20_58.431709
path:
- results_2023-12-03T15-20-58.431709.parquet
- split: latest
path:
- results_2023-12-03T15-20-58.431709.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B](https://huggingface.co/perlthoughts/Chupacabra-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T15:20:58.431709](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B/blob/main/results_2023-12-03T15-20-58.431709.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.621683093252464,
"acc_stderr": 0.013358407831777112
},
"harness|gsm8k|5": {
"acc": 0.621683093252464,
"acc_stderr": 0.013358407831777112
}
}
```
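As a quick sanity check, the metrics above can be read programmatically. The sketch below parses the same structure using only the standard library (the literal values are copied from the snippet above; a real results file may carry additional config fields):

```python
import json

# The same structure as results_2023-12-03T15-20-58.431709.json, with the
# metric values copied from the snippet above.
results_text = '''
{
    "all": {
        "acc": 0.621683093252464,
        "acc_stderr": 0.013358407831777112
    },
    "harness|gsm8k|5": {
        "acc": 0.621683093252464,
        "acc_stderr": 0.013358407831777112
    }
}
'''

results = json.loads(results_text)

# Extract the 5-shot GSM8K accuracy and its standard error.
acc = results["harness|gsm8k|5"]["acc"]
acc_stderr = results["harness|gsm8k|5"]["acc_stderr"]
print(f"gsm8k 5-shot: {acc:.4f} +/- {acc_stderr:.4f}")
```

Since only GSM8K was evaluated in this run, the `"all"` aggregate equals the per-task value.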
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
daze-unlv/medmcqa-alignment | ---
license: apache-2.0
---
|
yzhuang/metatree_fri_c1_1000_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 69300
num_examples: 693
- name: validation
num_bytes: 30700
num_examples: 307
download_size: 105285
dataset_size: 100000
---
# Dataset Card for "metatree_fri_c1_1000_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lca0503/amazon_tts_encodec | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: instruction
dtype: string
- name: transcription
dtype: string
- name: src_encodec_0
sequence: int64
- name: src_encodec_1
sequence: int64
- name: src_encodec_2
sequence: int64
- name: src_encodec_3
sequence: int64
- name: src_encodec_4
sequence: int64
- name: src_encodec_5
sequence: int64
- name: src_encodec_6
sequence: int64
- name: src_encodec_7
sequence: int64
- name: tgt_encodec_0
sequence: int64
- name: tgt_encodec_1
sequence: int64
- name: tgt_encodec_2
sequence: int64
- name: tgt_encodec_3
sequence: int64
- name: tgt_encodec_4
sequence: int64
- name: tgt_encodec_5
sequence: int64
- name: tgt_encodec_6
sequence: int64
- name: tgt_encodec_7
sequence: int64
splits:
- name: train
num_bytes: 676568694
num_examples: 19143
download_size: 108921169
dataset_size: 676568694
---
# Dataset Card for "amazon_tts_encodec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xswu/human_preference_dataset | ---
license: cc-by-4.0
---
|
CyberHarem/hata_no_kokoro_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hata_no_kokoro/秦こころ/하타노코코로 (Touhou)
This is the dataset of hata_no_kokoro/秦こころ/하타노코코로 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, pink_hair, pink_eyes, bow, very_long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 711.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 430.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1210 | 878.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 637.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1210 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hata_no_kokoro_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hata_no_kokoro_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
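Once loaded, items can be filtered by their tags before further processing. The following is only a sketch of the idea, written against plain dictionaries rather than the waifuc API: it assumes each item's metadata maps tag names to confidence scores, and the `keep_tagged` helper is hypothetical, not part of waifuc.

```python
def keep_tagged(items, required_tag, threshold=0.5):
    """Keep only items whose tag dict contains `required_tag` at or above `threshold`."""
    return [
        item for item in items
        if item.get("tags", {}).get(required_tag, 0.0) >= threshold
    ]

# Toy stand-ins for the (filename, tags) metadata printed in the loop above;
# the scores here are illustrative, not real crawler output.
items = [
    {"filename": "1.png", "tags": {"1girl": 0.99, "fox_mask": 0.87}},
    {"filename": "2.png", "tags": {"1girl": 0.98, "naginata": 0.91}},
]

masked = keep_tagged(items, "fox_mask")
print([item["filename"] for item in masked])  # → ['1.png']
```

The same predicate could be applied inside the `for item in source:` loop, using whatever tag structure `item.meta['tags']` actually exposes.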
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bubble_skirt, expressionless, folding_fan, long_sleeves, looking_at_viewer, noh_mask, plaid_shirt, solo, fox_mask, oni_mask |
| 1 | 5 |  |  |  |  |  | 1girl, bubble_skirt, expressionless, folding_fan, fox_mask, long_sleeves, looking_at_viewer, plaid_shirt, solo |
| 2 | 8 |  |  |  |  |  | 1girl, fox_mask, long_sleeves, looking_at_viewer, noh_mask, oni_mask, plaid_shirt, solo, bubble_skirt, expressionless, mouth_mask, wide_sleeves |
| 3 | 14 |  |  |  |  |  | 1girl, bubble_skirt, expressionless, fox_mask, long_sleeves, naginata, plaid_shirt, solo, looking_at_viewer, oni_mask, mouth_mask |
| 4 | 8 |  |  |  |  |  | 1girl, bubble_skirt, circle, closed_mouth, collared_shirt, long_sleeves, looking_at_viewer, plaid_shirt, solo, star_(symbol), triangle, buttons, green_shirt, hair_between_eyes, mask_on_head, orange_skirt, purple_bowtie, white_background, expressionless, fox_mask, simple_background, folding_fan, holding_fan, standing, blue_bowtie, pink_skirt |
| 5 | 9 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, looking_at_viewer, mask_on_head, plaid_shirt, solo, expressionless, fox_mask, hair_between_eyes, purple_bowtie, upper_body, green_shirt, collared_shirt, simple_background, star_(symbol), white_background |
| 6 | 9 |  |  |  |  |  | 1girl, long_sleeves, solo, wide_sleeves, alternate_costume, floral_print, looking_at_viewer, mask_on_head, blush, hair_between_eyes, obi, sidelocks, closed_mouth, holding, standing, expressionless, alternate_hairstyle, pink_kimono, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bubble_skirt | expressionless | folding_fan | long_sleeves | looking_at_viewer | noh_mask | plaid_shirt | solo | fox_mask | oni_mask | mouth_mask | wide_sleeves | naginata | circle | closed_mouth | collared_shirt | star_(symbol) | triangle | buttons | green_shirt | hair_between_eyes | mask_on_head | orange_skirt | purple_bowtie | white_background | simple_background | holding_fan | standing | blue_bowtie | pink_skirt | upper_body | alternate_costume | floral_print | blush | obi | sidelocks | holding | alternate_hairstyle | pink_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------|:--------------|:---------------|:--------------------|:-----------|:--------------|:-------|:-----------|:-----------|:-------------|:---------------|:-----------|:---------|:---------------|:-----------------|:----------------|:-----------|:----------|:--------------|:--------------------|:---------------|:---------------|:----------------|:-------------------|:--------------------|:--------------|:-----------|:--------------|:-------------|:-------------|:--------------------|:---------------|:--------|:------|:------------|:----------|:----------------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | X | | X | X | | X | X | X | | | | | | X | X | X | | | X | X | X | | X | X | X | | | | | X | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | | X | X | | | X | | | | X | | | X | | | | | | X | X | | | X | | | X | | | | X | X | X | X | X | X | X | X |
|
floworId/hallebailey | ---
license: other
---
|
joey234/mmlu-virology-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 59734
num_examples: 166
download_size: 40286
dataset_size: 59734
---
# Dataset Card for "mmlu-virology-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sngsfydy/aptos | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
splits:
- name: train
num_bytes: 6185316143.746
num_examples: 3662
download_size: 8874518024
dataset_size: 6185316143.746
---
# Dataset Card for "aptos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
turingmachine/hupd-npe-balanced-same-year-same-class-subset | ---
dataset_info:
features:
- name: application_number
dtype: int64
- name: decision
dtype: string
- name: title
dtype: string
- name: abstract
dtype: string
- name: claims
dtype: string
- name: description
dtype: string
- name: background
dtype: string
- name: summary
dtype: string
- name: cpc_label
dtype: string
- name: filing_date
dtype: string
- name: patent_issue_date
dtype: string
- name: date_published
dtype: string
- name: examiner_id
dtype: string
- name: ipc_label
dtype: string
- name: npe_litigated_count
dtype: int64
- name: examiner_full_name
dtype: string
- name: invention_title
dtype: string
- name: small_entity_indicator
dtype: string
- name: continuation
dtype: int64
- name: decision_as_of_2020
dtype: string
- name: main_ipcr_label_subclass
dtype: string
- name: filing_year
dtype: int64
splits:
- name: train
num_bytes: 2640151720
num_examples: 33158
download_size: 1015448971
dataset_size: 2640151720
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/hina_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hina/空崎ヒナ/日奈 (Blue Archive)
This is the dataset of hina/空崎ヒナ/日奈 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, horns, white_hair, purple_eyes, demon_horns, halo, parted_bangs, ahoge, wings, hair_ornament, demon_wings, hairclip, multiple_horns, demon_girl, very_long_hair, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.17 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hina_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 962.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hina_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1378 | 2.00 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hina_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hina_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, blush, official_alternate_costume, small_breasts, solo, whistle_around_neck, looking_at_viewer, name_tag, old_school_swimsuit, one_side_up, outdoors, swim_ring, blue_sky, cloud, innertube, wet, closed_mouth, collarbone, covered_navel, day, low_wings, ocean, water, bare_arms, beach, horizon, sitting |
| 1 | 11 |  |  |  |  |  | 1girl, innertube, looking_at_viewer, name_tag, official_alternate_costume, one_side_up, solo, swim_ring, collarbone, whistle_around_neck, old_school_swimsuit, water, blush, blue_one-piece_swimsuit, closed_mouth, low_wings, smile |
| 2 | 9 |  |  |  |  |  | 1girl, black_horns, blush, elbow_gloves, looking_at_viewer, necklace, official_alternate_costume, official_alternate_hairstyle, pendant, purple_dress, purple_gloves, solo, strapless_dress, collarbone, dangle_earrings, grand_piano, bare_shoulders, closed_mouth, grey_hair, piano_keys, smile, purple_wings |
| 3 | 14 |  |  |  |  |  | 1girl, black_horns, elbow_gloves, necklace, official_alternate_costume, official_alternate_hairstyle, pendant, purple_dress, purple_gloves, solo, strapless_dress, dangle_earrings, looking_at_viewer, bare_shoulders, blush, closed_mouth, collarbone, grey_hair, smile, upper_body |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_horns, elbow_gloves, from_behind, looking_at_viewer, looking_back, necklace, official_alternate_costume, official_alternate_hairstyle, purple_dress, purple_gloves, solo, strapless_dress, blush, closed_mouth, dangle_earrings, simple_background, white_background, backless_dress, back_focus, grey_hair |
| 5 | 5 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, official_alternate_costume, polka_dot, solo, blush, hair_between_eyes, pink_pajamas, white_background, closed_mouth, simple_background, black_horns, lying, upper_body |
| 6 | 6 |  |  |  |  |  | 1girl, hair_between_eyes, long_sleeves, looking_at_viewer, official_alternate_costume, pink_pajamas, polka_dot, yellow_cardigan, blush, closed_mouth, jacket, open_cardigan, solo, sitting, black_horns, depth_of_field, indoors |
| 7 | 10 |  |  |  |  |  | 1girl, black_gloves, black_skirt, jacket, long_sleeves, looking_at_viewer, military_uniform, pencil_skirt, solo, belt, coat_on_shoulders, forehead, side_slit, black_thighhighs, armband, fur-trimmed_coat, zettai_ryouiki, black_coat, closed_mouth, simple_background, miniskirt, cowboy_shot, hand_on_own_hip, white_background |
| 8 | 13 |  |  |  |  |  | 1girl, black_skirt, forehead, long_sleeves, looking_at_viewer, machine_gun, mg42, military_uniform, pencil_skirt, side_slit, solo, black_gloves, coat_on_shoulders, fur-trimmed_coat, black_coat, belt, black_thighhighs, miniskirt, zettai_ryouiki, holding_gun, closed_mouth, armband, boots |
| 9 | 6 |  |  |  |  |  | 1girl, black_skirt, crossed_legs, long_sleeves, looking_at_viewer, military_uniform, sitting, solo, belt, black_gloves, black_thighhighs, coat_on_shoulders, forehead, knee_boots, pencil_skirt, black_footwear, jacket, closed_mouth, fur-trimmed_coat |
| 10 | 7 |  |  |  |  |  | 1girl, black_skirt, blush, looking_at_viewer, miniskirt, pencil_skirt, sleeveless_shirt, solo, white_shirt, black_thighhighs, side_slit, closed_mouth, collared_shirt, simple_background, small_breasts, white_background, armpits, forehead, frills, sitting, zettai_ryouiki, arms_up, bare_shoulders |
| 11 | 8 |  |  |  |  |  | 1girl, black_skirt, forehead, looking_at_viewer, pencil_skirt, sitting, sleeveless_shirt, solo, white_shirt, black_thighhighs, collared_shirt, bare_shoulders, miniskirt, zettai_ryouiki, blush, side_slit, bare_arms, depth_of_field, smile, closed_mouth, indoors, ponytail, purple_thighhighs, wavy_hair |
| 12 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, maid_apron, blush, enmaided, maid_headdress, black_dress, simple_background, frilled_apron, white_apron, bowtie, closed_mouth, holding, puffy_short_sleeves, white_background, forehead, long_sleeves, white_thighhighs |
| 13 | 6 |  |  |  |  |  | 1girl, alternate_costume, blush, looking_at_viewer, outdoors, sleeveless_dress, solo, white_dress, bare_shoulders, sundress, closed_mouth, collarbone, hat, skirt_hold, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_one-piece_swimsuit | blush | official_alternate_costume | small_breasts | solo | whistle_around_neck | looking_at_viewer | name_tag | old_school_swimsuit | one_side_up | outdoors | swim_ring | blue_sky | cloud | innertube | wet | closed_mouth | collarbone | covered_navel | day | low_wings | ocean | water | bare_arms | beach | horizon | sitting | smile | black_horns | elbow_gloves | necklace | official_alternate_hairstyle | pendant | purple_dress | purple_gloves | strapless_dress | dangle_earrings | grand_piano | bare_shoulders | grey_hair | piano_keys | purple_wings | upper_body | from_behind | looking_back | simple_background | white_background | backless_dress | back_focus | long_sleeves | polka_dot | hair_between_eyes | pink_pajamas | lying | yellow_cardigan | jacket | open_cardigan | depth_of_field | indoors | black_gloves | black_skirt | military_uniform | pencil_skirt | belt | coat_on_shoulders | forehead | side_slit | black_thighhighs | armband | fur-trimmed_coat | zettai_ryouiki | black_coat | miniskirt | cowboy_shot | hand_on_own_hip | machine_gun | mg42 | holding_gun | boots | crossed_legs | knee_boots | black_footwear | sleeveless_shirt | white_shirt | collared_shirt | armpits | frills | arms_up | ponytail | purple_thighhighs | wavy_hair | maid_apron | enmaided | maid_headdress | black_dress | frilled_apron | white_apron | bowtie | holding | puffy_short_sleeves | white_thighhighs | alternate_costume | sleeveless_dress | white_dress | sundress | hat | skirt_hold |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------------|:--------|:-----------------------------|:----------------|:-------|:----------------------|:--------------------|:-----------|:----------------------|:--------------|:-----------|:------------|:-----------|:--------|:------------|:------|:---------------|:-------------|:----------------|:------|:------------|:--------|:--------|:------------|:--------|:----------|:----------|:--------|:--------------|:---------------|:-----------|:-------------------------------|:----------|:---------------|:----------------|:------------------|:------------------|:--------------|:-----------------|:------------|:-------------|:---------------|:-------------|:--------------|:---------------|:--------------------|:-------------------|:-----------------|:-------------|:---------------|:------------|:--------------------|:---------------|:--------|:------------------|:---------|:----------------|:-----------------|:----------|:---------------|:--------------|:-------------------|:---------------|:-------|:--------------------|:-----------|:------------|:-------------------|:----------|:-------------------|:-----------------|:-------------|:------------|:--------------|:------------------|:--------------|:-------|:--------------|:--------|:---------------|:-------------|:-----------------|:-------------------|:--------------|:-----------------|:----------|:---------|:----------|:-----------|:--------------------|:------------|:-------------|:-----------|:-----------------|:--------------|:----------------|:--------------|:---------|:----------|:----------------------|:-------------------|:--------------------|:-------------------|:--------------|:-----------|:------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | | X | | | X | | X | X | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | | X | X | X | X | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | X | | X | | X | | | | | | | | | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 13 |  |  |  |  |  | X | | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | | X | | X | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | X | X | X | X | X | X | X | | X | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | | X | | X | X | | X | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | X | | X | | | X | X | X | | | X | | X | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 11 | 8 |  |  |  |  |  | X | | X | | | X | | X | | | | | | | | | | X | | | | | | | X | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | | X | | X | | | X | X | X | | | X | | X | | | | | | | | | | X | X | X | | | | X | X | X | | | | | | | | | | | | | | | | |
| 12 | 10 |  |  |  |  |  | X | | X | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 13 | 6 |  |  |  |  |  | X | | X | | | X | | X | | | | X | | | | | | X | X | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
jxm/subj | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1128835
num_examples: 8000
- name: test
num_bytes: 286215
num_examples: 2000
- name: dev
num_bytes: 37250
num_examples: 256
download_size: 960873
dataset_size: 1452300
---
# Dataset Card for "subj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hu_tao_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hu_tao/胡桃/胡桃 (Genshin Impact)
This is the dataset of hu_tao/胡桃/胡桃 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `long_hair, brown_hair, red_eyes, symbol-shaped_pupils, twintails, flower-shaped_pupils, very_long_hair, hair_between_eyes, hat, black_headwear, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:---------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.31 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hu_tao_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hu_tao_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1400 | 2.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hu_tao_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hu_tao_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, solo, thighs, bare_shoulders, hair_flower, open_mouth, red_flower, china_dress, nail_polish, black_nails, blush, plum_blossoms, :d, pelvic_curtain, sleeveless_dress, cleavage, clothing_cutout, ghost, black_thighhighs, hand_up, medium_breasts, ring, simple_background, white_background, bare_arms, black_dress, covered_navel, cowboy_shot, red_dress, side_slit, sitting, small_breasts, tassel |
| 1 | 12 |  |  |  |  |  | 1girl, black_shorts, chinese_clothes, hat_flower, long_sleeves, looking_at_viewer, nail_polish, short_shorts, solo, black_nails, ghost, thighs, plum_blossoms, bead_bracelet, cowboy_shot, shirt, blush, grin, porkpie_hat, multiple_rings, coat, wide_sleeves |
| 2 | 5 |  |  |  |  |  | 1girl, black_nails, black_shorts, chinese_clothes, hat_flower, long_sleeves, looking_at_viewer, multiple_rings, plum_blossoms, porkpie_hat, shirt, smile, solo, thighs, nail_polish, short_shorts, white_socks, bead_bracelet, coattails, :q, blush, closed_mouth, ghost_pose, open_mouth, orange_eyes, red_flower |
| 3 | 15 |  |  |  |  |  | 1girl, alternate_costume, looking_at_viewer, pleated_skirt, solo, blush, smile, black_skirt, hair_flower, miniskirt, thighs, long_sleeves, black_nails, closed_mouth, red_flower, red_neckerchief, nail_polish, plum_blossoms, zettai_ryouiki, black_sailor_collar, black_serafuku, black_shirt, cowboy_shot, crop_top, ghost, midriff, tongue_out, white_shirt, multiple_rings, one_eye_closed, white_thighhighs |
| 4 | 6 |  |  |  |  |  | 1girl, black_dress, blush, enmaided, hair_flower, looking_at_viewer, maid_apron, maid_headdress, puffy_sleeves, solo, frilled_apron, white_apron, :d, bow, frilled_dress, ghost, long_sleeves, nail_polish, open_mouth, red_flower, short_sleeves, sidelocks, thighhighs |
| 5 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, navel, small_breasts, solo, stomach, smile, bare_shoulders, outdoors, black_bikini, blue_sky, day, thighs, blush, cloud, ocean, alternate_costume, armpits, frilled_bikini, hair_flower, water, beach, holding, thigh_strap |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | alternate_costume | looking_at_viewer | solo | thighs | bare_shoulders | hair_flower | open_mouth | red_flower | china_dress | nail_polish | black_nails | blush | plum_blossoms | :d | pelvic_curtain | sleeveless_dress | cleavage | clothing_cutout | ghost | black_thighhighs | hand_up | medium_breasts | ring | simple_background | white_background | bare_arms | black_dress | covered_navel | cowboy_shot | red_dress | side_slit | sitting | small_breasts | tassel | black_shorts | chinese_clothes | hat_flower | long_sleeves | short_shorts | bead_bracelet | shirt | grin | porkpie_hat | multiple_rings | coat | wide_sleeves | smile | white_socks | coattails | :q | closed_mouth | ghost_pose | orange_eyes | pleated_skirt | black_skirt | miniskirt | red_neckerchief | zettai_ryouiki | black_sailor_collar | black_serafuku | black_shirt | crop_top | midriff | tongue_out | white_shirt | one_eye_closed | white_thighhighs | enmaided | maid_apron | maid_headdress | puffy_sleeves | frilled_apron | white_apron | bow | frilled_dress | short_sleeves | sidelocks | thighhighs | navel | stomach | outdoors | black_bikini | blue_sky | day | cloud | ocean | armpits | frilled_bikini | water | beach | holding | thigh_strap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:-------|:---------|:-----------------|:--------------|:-------------|:-------------|:--------------|:--------------|:--------------|:--------|:----------------|:-----|:-----------------|:-------------------|:-----------|:------------------|:--------|:-------------------|:----------|:-----------------|:-------|:--------------------|:-------------------|:------------|:--------------|:----------------|:--------------|:------------|:------------|:----------|:----------------|:---------|:---------------|:------------------|:-------------|:---------------|:---------------|:----------------|:--------|:-------|:--------------|:-----------------|:-------|:---------------|:--------|:--------------|:------------|:-----|:---------------|:-------------|:--------------|:----------------|:--------------|:------------|:------------------|:-----------------|:----------------------|:-----------------|:--------------|:-----------|:----------|:-------------|:--------------|:-----------------|:-------------------|:-----------|:-------------|:-----------------|:----------------|:----------------|:--------------|:------|:----------------|:----------------|:------------|:-------------|:--------|:----------|:-----------|:---------------|:-----------|:------|:--------|:--------|:----------|:-----------------|:--------|:--------|:----------|:--------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | X | X | | | | | | X | X | X | X | | | | | | X | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | X | X | X | | | | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | | X | | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | X | | | X | X | X | | X | | X | | X | | | | | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
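The clusters above are tag co-occurrence groups; to pull out the images of one outfit, you could keep only those whose tag files contain all of a cluster's distinguishing tags. A minimal sketch (the `matches_cluster` helper is an assumption for illustration, not part of any published tooling):

```python
def matches_cluster(image_tags, cluster_tags):
    # An image is considered part of a cluster if it carries every
    # distinguishing tag of that cluster.
    return set(cluster_tags).issubset(set(image_tags))


# Distinguishing tags for cluster 4 (the maid outfit) taken from the table above.
MAID_CLUSTER = ['enmaided', 'maid_apron', 'maid_headdress']
```

Combined with a tag parser over the extracted `.txt` files, this would let you copy each cluster's images into its own folder.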