datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
KenDoStudio/Cassidy-Golden_Freddy | ---
license: mit
---
|
Emrekrtlus/deneme | ---
task_categories:
- text-classification
language:
- tr
tags:
- cyber bullying
--- |
linhqyy/result_with_finetuned_taggenv2_10epoch_encoder_embeddings_decoder_roberta | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371675.027
num_examples: 1299
download_size: 164200911
dataset_size: 174371675.027
---
# Dataset Card for "result_with_finetuned_taggenv2_10epoch_encoder_embeddings_decoder_roberta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maxmyn/wholesome_simple_greentext_133k | ---
dataset_info:
features:
- name: greentexts
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17090474
num_examples: 133442
download_size: 10465468
dataset_size: 17090474
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nielsr/FUNSD_layoutlmv2 | ---
language:
- en
paperswithcode_id: funsd
---
# Dataset Card for "FUNSD"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
The [FUNSD](https://guillaumejaume.github.io/FUNSD/) dataset, with one difference compared to the original: each document image is resized to 224x224.
The FUNSD dataset is a collection of annotated forms.
This dataset loading script is taken from the [official LayoutLMv2 implementation](https://github.com/microsoft/unilm/blob/master/layoutlmft/layoutlmft/data/datasets/funsd.py), and updated to not include any Detectron2 dependencies.
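Purely as an illustration of the resizing note above (this helper is not part of the official loading script, and the page size is made up), word bounding boxes can be rescaled to stay aligned with the 224x224 images:

```python
def scale_box(box, orig_size, target_size=(224, 224)):
    """Rescale an (x0, y0, x1, y1) box from the original page size to the resized image."""
    ow, oh = orig_size
    tw, th = target_size
    x0, y0, x1, y1 = box
    return (
        int(x0 * tw / ow),
        int(y0 * th / oh),
        int(x1 * tw / ow),
        int(y1 * th / oh),
    )

# A word box on a hypothetical 762x1000 scanned form.
print(scale_box((100, 200, 300, 400), (762, 1000)))
```

The actual image resizing (e.g. with Pillow's `Image.resize`) is omitted here; only the coordinate rescaling is shown.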
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
We show detailed information for up to 5 configurations of the dataset.
### Data Instances
#### conll2000
- **Size of downloaded dataset files:** 3.32 MB
- **Size of the generated dataset:** 6.25 MB
- **Total amount of disk used:** 9.57 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"chunk_tags": [11, 13, 11, 12, 21, 22, 22, 22, 22, 11, 12, 12, 17, 11, 12, 13, 11, 0, 1, 13, 11, 11, 0, 21, 22, 22, 11, 12, 12, 13, 11, 12, 12, 11, 12, 12, 0],
"id": "0",
"pos_tags": [19, 14, 11, 19, 39, 27, 37, 32, 34, 11, 15, 19, 14, 19, 22, 14, 20, 5, 15, 14, 19, 19, 5, 34, 32, 34, 11, 15, 19, 14, 20, 9, 20, 24, 15, 22, 6],
"tokens": "[\"Confidence\", \"in\", \"the\", \"pound\", \"is\", \"widely\", \"expected\", \"to\", \"take\", \"another\", \"sharp\", \"dive\", \"if\", \"trade\", \"figur..."
}
```
### Data Fields
The data fields are the same among all splits.
### Data Splits
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{DBLP:journals/corr/abs-1905-13538,
author = {Guillaume Jaume and
Hazim Kemal Ekenel and
Jean{-}Philippe Thiran},
title = {{FUNSD:} {A} Dataset for Form Understanding in Noisy Scanned Documents},
journal = {CoRR},
volume = {abs/1905.13538},
year = {2019},
url = {http://arxiv.org/abs/1905.13538},
archivePrefix = {arXiv},
eprint = {1905.13538},
timestamp = {Mon, 03 Jun 2019 13:42:33 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-1905-13538.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@vblagoje](https://github.com/vblagoje), [@jplu](https://github.com/jplu) for adding this dataset. |
BrainGPT/BrainBench_GPT-4_v0.1.csv | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: doi
dtype: string
- name: journal_section
dtype: string
- name: original_abstract
dtype: string
- name: incorrect_abstract
dtype: string
splits:
- name: train
num_bytes: 314072
num_examples: 100
download_size: 186330
dataset_size: 314072
---
# What is BrainBench?
BrainBench is a forward-looking benchmark for neuroscience that evaluates test-takers' ability to predict neuroscience results.
# What is BrainBench made of?
BrainBench's test cases were sourced from recent *Journal of Neuroscience* abstracts across five neuroscience domains:
Behavioral/Cognitive, Systems/Circuits, Neurobiology of Disease, Cellular/Molecular, and Developmental/Plasticity/Repair.
Test-takers chose between the original abstract and one altered to significantly change the result while maintaining coherence.
# How is BrainBench applied?
Human experts and large language models (LLMs) were tasked with selecting the correct (i.e., original) version from the two options.
Human experts made their choices and provided confidence and expertise ratings in an online study.
LLMs were scored as choosing the abstract with the lower perplexity (i.e., the text passage that was less surprising to the model) and their confidence was proportional to the difference in perplexity between the two options.
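The perplexity-based scoring described above can be sketched in plain Python; the per-token log-probabilities below are made-up numbers, and `choose_abstract` is a hypothetical helper, not the benchmark's actual code:

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp(-mean log-probability) over the tokens of a passage."""
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

def choose_abstract(logprobs_a, logprobs_b):
    """Pick the less surprising abstract; confidence grows with the perplexity gap."""
    ppl_a, ppl_b = perplexity(logprobs_a), perplexity(logprobs_b)
    choice = "A" if ppl_a < ppl_b else "B"
    confidence = abs(ppl_a - ppl_b)
    return choice, confidence

# Made-up per-token log-probabilities for the two candidate abstracts.
original = [-1.2, -0.8, -1.0, -0.9]
altered = [-1.9, -1.4, -2.1, -1.6]
print(choose_abstract(original, altered))  # the lower-perplexity option wins
```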
**BrainBench_GPT-4_v0.1.csv** was generated by GPT-4 (Azure OpenAI API; version 2023-05-15). |
zche318/microstructure_porosity_periodic | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1924418.0
num_examples: 680
download_size: 1931686
dataset_size: 1924418.0
---
# Dataset Card for "microstructure_porosity_periodic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TeamSODA/mcl-signal_processing_attacks_whisper_librispeech | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': 0-benign
'1': 1-kenan
'2': 2-yeehaw
'3': 3-imaginary_clipping
splits:
- name: train
num_bytes: 9472066083.0
num_examples: 12000
download_size: 8061059411
dataset_size: 9472066083.0
---
# Dataset Card for "mcl-signal_processing_attacks_large"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
movie_rationales | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: MovieRationales
dataset_info:
features:
- name: review
dtype: string
- name: label
dtype:
class_label:
names:
'0': NEG
'1': POS
- name: evidences
sequence: string
splits:
- name: test
num_bytes: 1046377
num_examples: 199
- name: train
num_bytes: 6853624
num_examples: 1600
- name: validation
num_bytes: 830417
num_examples: 200
download_size: 3899487
dataset_size: 8730418
---
# Dataset Card for "movie_rationales"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/jayded/eraserbenchmark
- **Paper:** [ERASER: A Benchmark to Evaluate Rationalized NLP Models](https://aclanthology.org/2020.acl-main.408/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3.90 MB
- **Size of the generated dataset:** 8.73 MB
- **Total amount of disk used:** 12.62 MB
### Dataset Summary
The movie rationales dataset contains human-annotated rationales for movie reviews.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 3.90 MB
- **Size of the generated dataset:** 8.73 MB
- **Total amount of disk used:** 12.62 MB
An example of 'validation' looks as follows.
```
{
"evidences": ["Fun movie"],
"label": 1,
"review": "Fun movie\n"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `review`: a `string` feature.
- `label`: a classification label, with possible values including `NEG` (0), `POS` (1).
- `evidences`: a `list` of `string` features.
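Given those fields, a minimal sanity check that every annotated rationale actually occurs in its review might look like the following sketch (the record is the 'validation' example shown above; `evidences_in_review` is an illustrative helper, not part of the dataset tooling):

```python
record = {
    "evidences": ["Fun movie"],
    "label": 1,  # 0 = NEG, 1 = POS
    "review": "Fun movie\n",
}

label_names = ["NEG", "POS"]

def evidences_in_review(example):
    """Check that each rationale snippet is a substring of the full review."""
    return all(ev in example["review"] for ev in example["evidences"])

print(label_names[record["label"]], evidences_in_review(record))
```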
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default| 1600| 200| 199|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{deyoung-etal-2020-eraser,
title = "{ERASER}: {A} Benchmark to Evaluate Rationalized {NLP} Models",
author = "DeYoung, Jay and
Jain, Sarthak and
Rajani, Nazneen Fatema and
Lehman, Eric and
Xiong, Caiming and
Socher, Richard and
Wallace, Byron C.",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.acl-main.408",
doi = "10.18653/v1/2020.acl-main.408",
pages = "4443--4458",
}
@InProceedings{zaidan-eisner-piatko-2008:nips,
author = {Omar F. Zaidan and Jason Eisner and Christine Piatko},
title = {Machine Learning with Annotator Rationales to Reduce Annotation Cost},
booktitle = {Proceedings of the NIPS*2008 Workshop on Cost Sensitive Learning},
month = {December},
year = {2008}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. |
Codec-SUPERB/mridangam_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 9307086
num_examples: 6977
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 9307086
num_examples: 6977
- name: academicodec_hifi_24k_320d
num_bytes: 13772366
num_examples: 6977
- name: audiodec_24k_320d
num_bytes: 29512478
num_examples: 6977
- name: dac_16k
num_bytes: 28061262
num_examples: 6977
- name: dac_24k
num_bytes: 110110782
num_examples: 6977
- name: dac_44k
num_bytes: 35680146
num_examples: 6977
- name: encodec_24k
num_bytes: 7130262
num_examples: 6977
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 74388542
num_examples: 6977
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 74388542
num_examples: 6977
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 74388542
num_examples: 6977
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 38666302
num_examples: 6977
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 74388542
num_examples: 6977
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 74388542
num_examples: 6977
- name: speech_tokenizer_16k
num_bytes: 18795806
num_examples: 6977
download_size: 98187324
dataset_size: 672286286
---
# Dataset Card for "mridangam_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cuidada__Hua-v0.1 | ---
pretty_name: Evaluation run of cuidada/Hua-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cuidada/Hua-v0.1](https://huggingface.co/cuidada/Hua-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cuidada__Hua-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-16T00:06:09.531722](https://huggingface.co/datasets/open-llm-leaderboard/details_cuidada__Hua-v0.1/blob/main/results_2024-04-16T00-06-09.531722.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4415238153037838,\n\
\ \"acc_stderr\": 0.03466565163116275,\n \"acc_norm\": 0.44601641335773506,\n\
\ \"acc_norm_stderr\": 0.03542078835760482,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839672,\n \"mc2\": 0.43175858802279954,\n\
\ \"mc2_stderr\": 0.014663520808365601\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4112627986348123,\n \"acc_stderr\": 0.014379441068522077,\n\
\ \"acc_norm\": 0.4462457337883959,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4843656642103167,\n\
\ \"acc_stderr\": 0.004987341485856657,\n \"acc_norm\": 0.6652061342362079,\n\
\ \"acc_norm_stderr\": 0.004709538864916341\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723456,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723456\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4513888888888889,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.4513888888888889,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3988439306358382,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.3988439306358382,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.02418049716437689,\n \"acc_norm\"\
: 0.328042328042328,\n \"acc_norm_stderr\": 0.02418049716437689\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4870967741935484,\n \"acc_stderr\": 0.028434533152681855,\n \"\
acc_norm\": 0.4870967741935484,\n \"acc_norm_stderr\": 0.028434533152681855\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"\
acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5707070707070707,\n \"acc_stderr\": 0.03526552724601199,\n \"\
acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.03526552724601199\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5077720207253886,\n \"acc_stderr\": 0.03608003225569654,\n\
\ \"acc_norm\": 0.5077720207253886,\n \"acc_norm_stderr\": 0.03608003225569654\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.025189149894764198,\n\
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.025189149894764198\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.032339434681820885,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.032339434681820885\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5669724770642202,\n \"acc_stderr\": 0.02124414656907434,\n \"\
acc_norm\": 0.5669724770642202,\n \"acc_norm_stderr\": 0.02124414656907434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.032036140846700596,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.032036140846700596\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47085201793721976,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.47085201793721976,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.039223782906109894,\n\
\ \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.039223782906109894\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.029343114798094472,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.029343114798094472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5862068965517241,\n\
\ \"acc_stderr\": 0.017612204084663772,\n \"acc_norm\": 0.5862068965517241,\n\
\ \"acc_norm_stderr\": 0.017612204084663772\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.430635838150289,\n \"acc_stderr\": 0.02665880027367237,\n\
\ \"acc_norm\": 0.430635838150289,\n \"acc_norm_stderr\": 0.02665880027367237\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.028607893699576066,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.028607893699576066\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.49517684887459806,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.027794760105008736,\n\
\ \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.027794760105008736\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3389830508474576,\n\
\ \"acc_stderr\": 0.012089941857584476,\n \"acc_norm\": 0.3389830508474576,\n\
\ \"acc_norm_stderr\": 0.012089941857584476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.41830065359477125,\n \"acc_stderr\": 0.019955975145835546,\n \
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.019955975145835546\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065685,\n\
\ \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065685\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.03834234744164993,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.03834234744164993\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.01578537085839672,\n \"mc2\": 0.43175858802279954,\n\
\ \"mc2_stderr\": 0.014663520808365601\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6306235201262825,\n \"acc_stderr\": 0.013564470596053512\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20318423047763456,\n \
\ \"acc_stderr\": 0.011083227665267797\n }\n}\n```"
repo_url: https://huggingface.co/cuidada/Hua-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|arc:challenge|25_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|gsm8k|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hellaswag|10_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-06-09.531722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-16T00-06-09.531722.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- '**/details_harness|winogrande|5_2024-04-16T00-06-09.531722.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-16T00-06-09.531722.parquet'
- config_name: results
data_files:
- split: 2024_04_16T00_06_09.531722
path:
- results_2024-04-16T00-06-09.531722.parquet
- split: latest
path:
- results_2024-04-16T00-06-09.531722.parquet
---
# Dataset Card for Evaluation run of cuidada/Hua-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cuidada/Hua-v0.1](https://huggingface.co/cuidada/Hua-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
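Each config name listed in the metadata above is derived from the harness task identifier by replacing the `|`, `:`, and `-` separators with underscores (for example, `harness|truthfulqa:mc|0` maps to the config `harness_truthfulqa_mc_0`). A minimal sketch of that mapping (the helper name is illustrative, not part of the `datasets` API):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id such as 'harness|truthfulqa:mc|0' to the
    corresponding dataset config name, 'harness_truthfulqa_mc_0'."""
    # Config names use underscores wherever the task id uses |, :, or -.
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config_name("harness|winogrande|5"))
# harness_winogrande_5
print(task_to_config_name("harness|hendrycksTest-abstract_algebra|5"))
# harness_hendrycksTest_abstract_algebra_5
```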
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cuidada__Hua-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-16T00:06:09.531722](https://huggingface.co/datasets/open-llm-leaderboard/details_cuidada__Hua-v0.1/blob/main/results_2024-04-16T00-06-09.531722.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4415238153037838,
"acc_stderr": 0.03466565163116275,
"acc_norm": 0.44601641335773506,
"acc_norm_stderr": 0.03542078835760482,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839672,
"mc2": 0.43175858802279954,
"mc2_stderr": 0.014663520808365601
},
"harness|arc:challenge|25": {
"acc": 0.4112627986348123,
"acc_stderr": 0.014379441068522077,
"acc_norm": 0.4462457337883959,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.4843656642103167,
"acc_stderr": 0.004987341485856657,
"acc_norm": 0.6652061342362079,
"acc_norm_stderr": 0.004709538864916341
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723456,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723456
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4513888888888889,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.4513888888888889,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.02418049716437689,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.02418049716437689
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.03526552724601199,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.03526552724601199
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5077720207253886,
"acc_stderr": 0.03608003225569654,
"acc_norm": 0.5077720207253886,
"acc_norm_stderr": 0.03608003225569654
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.025189149894764198,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.025189149894764198
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.032339434681820885,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.032339434681820885
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5669724770642202,
"acc_stderr": 0.02124414656907434,
"acc_norm": 0.5669724770642202,
"acc_norm_stderr": 0.02124414656907434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.032036140846700596,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.032036140846700596
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47085201793721976,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.47085201793721976,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.039223782906109894,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.039223782906109894
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.029343114798094472,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.029343114798094472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.017612204084663772,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.017612204084663772
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.430635838150289,
"acc_stderr": 0.02665880027367237,
"acc_norm": 0.430635838150289,
"acc_norm_stderr": 0.02665880027367237
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961459,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3389830508474576,
"acc_stderr": 0.012089941857584476,
"acc_norm": 0.3389830508474576,
"acc_norm_stderr": 0.012089941857584476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.019955975145835546,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.019955975145835546
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065685,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065685
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.01578537085839672,
"mc2": 0.43175858802279954,
"mc2_stderr": 0.014663520808365601
},
"harness|winogrande|5": {
"acc": 0.6306235201262825,
"acc_stderr": 0.013564470596053512
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267797
}
}
```
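The per-task scores in the JSON above can be post-processed locally with the standard library; for example, a minimal sketch (using an illustrative excerpt of the keys above, not the full results file) that averages `acc_norm` over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# Illustrative excerpt of the results JSON shown above (not the full file).
results_json = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.26},
  "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4},
  "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.4473684210526316},
  "harness|truthfulqa:mc|0": {"mc2": 0.43175858802279954}
}
"""
results = json.loads(results_json)

# Average acc_norm over the MMLU (hendrycksTest) subtasks only.
mmlu = [v["acc_norm"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(round(mmlu_avg, 4))
```

The same filtering-by-prefix pattern works on the full results file downloaded from the link above.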
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/code_instructions_standardized_cluster_16_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 15443233
num_examples: 8165
download_size: 6991757
dataset_size: 15443233
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_16_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dosa777/data_kmslab | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22692346
num_examples: 66190
download_size: 7943416
dataset_size: 22692346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RoryLiu19/prapare_dataset_slide | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2281651
num_examples: 1919
download_size: 249872
dataset_size: 2281651
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
figfig/restaurant_order_local_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 270680.0
num_examples: 2
- name: test
num_bytes: 270680.0
num_examples: 2
download_size: 272201
dataset_size: 541360.0
---
# Dataset Card for "restaurant_order_local_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wentingzhao/redpajama-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 76680496
num_examples: 1028
download_size: 44812690
dataset_size: 76680496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
A subset of RedPajama that has been explicitly checked for overlaps with RedPajama-Data-1T-Sample, so one could use this for evaluation if RedPajama-Data-1T-Sample were the training data.
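The card does not say how the overlap check was performed; a common decontamination approach is word-level n-gram matching against the reference corpus. A minimal sketch of that idea (illustrative only — the helper names `ngrams` and `overlaps` are this sketch's, not necessarily the procedure used here):

```python
def ngrams(text, n=8):
    """Return the set of word-level n-grams in a text."""
    words = text.split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlaps(candidate, reference_corpus, n=8):
    """True if the candidate shares any word n-gram with the reference corpus."""
    ref = set()
    for doc in reference_corpus:
        ref |= ngrams(doc, n)
    return bool(ngrams(candidate, n) & ref)

reference = ["the quick brown fox jumps over the lazy dog near the river bank"]
clean = "completely different sentence with no shared phrases at all here okay"
dirty = "prefix words then the quick brown fox jumps over the lazy dog suffix"
print(overlaps(clean, reference), overlaps(dirty, reference))  # → False True
```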
|
kb-kim/Enhanced_Scene_Graph_Generation_Datasets | ---
license: unknown
---
|
QNN/autotrain-data-auto2 | ---
language:
- en
task_categories:
- token-classification
---
# AutoTrain Dataset for project: auto2
## Dataset Description
This dataset has been automatically processed by AutoTrain for project auto2.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"tokens": [
"Pd",
"has",
"been",
"regarded",
"as",
"one",
"of",
"the",
"alternatives",
"to",
"Pt",
"as",
"a",
"promising",
"hydrogen",
"evolution",
"reaction",
"(HER)",
"catalyst.",
"Strategies",
"including",
"Pd-metal",
"alloys",
"(Pd-M)",
"and",
"Pd",
"hydrides",
"(PdH<sub><i>x</i></sub>)",
"have",
"been",
"proposed",
"to",
"boost",
"HER",
"performances.",
"However,",
"the",
"stability",
"issues,",
"e.g.,",
"the",
"dissolution",
"in",
"Pd-M",
"and",
"the",
"hydrogen",
"releasing",
"in",
"PdH<sub><i>x</i></sub>,",
"restrict",
"the",
"industrial",
"application",
"of",
"Pd-based",
"HER",
"catalysts.",
"We",
"here",
"design",
"and",
"synthesize",
"a",
"stable",
"Pd-Cu",
"hydride",
"(",
"PdCu<sub>0.2</sub>H<sub>0.43</sub>",
")",
"catalyst,",
"combining",
"the",
"advantages",
"of",
"both",
"Pd-M",
"and",
"PdH<sub><i>x</i></sub>",
"structures",
"and",
"improving",
"the",
"HER",
"durability",
"simultaneously.",
"The",
"hydrogen",
"intercalation",
"is",
"realized",
"under",
"atmospheric",
"pressure",
"(1.0",
"atm)",
"following",
"our",
"synthetic",
"approach",
"that",
"imparts",
"high",
"stability",
"to",
"the",
"Pd-Cu",
"hydride",
"structure.",
"The",
"obtained",
"PdCu<sub>0.2</sub>H<sub>0.43</sub>",
"catalyst",
"exhibits",
"a",
"small",
"overpotential",
"of",
"28",
"mV",
"at",
"10",
"mA/cm<sup>2</sup>",
",",
"a",
"low",
"Tafel",
"slope",
"of",
"23",
"mV/dec",
",",
"and",
"excellent",
"HER",
"durability",
"due",
"to",
"its",
"appropriate",
"hydrogen",
"adsorption",
"free",
"energy",
"and",
"alleviated",
"metal",
"dissolution",
"rate.",
"</p>",
"<p>"
],
"tags": [
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
2,
2,
2,
2,
4,
2,
5,
5,
2,
5,
5,
2,
2,
2,
4,
2,
2,
5,
5,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2
]
},
{
"tokens": [
"A",
"critical",
"challenge",
"in",
"energy",
"research",
"is",
"the",
"development",
"of",
"earth",
"abundant",
"and",
"cost-effective",
"materials",
"that",
"catalyze",
"the",
"electrochemical",
"splitting",
"of",
"water",
"into",
"hydrogen",
"and",
"oxygen",
"at",
"high",
"rates",
"and",
"low",
"overpotentials.",
"Key",
"to",
"addressing",
"this",
"issue",
"lies",
"not",
"only",
"in",
"the",
"synthesis",
"of",
"new",
"materials,",
"but",
"also",
"in",
"the",
"elucidation",
"of",
"their",
"active",
"sites,",
"their",
"structure",
"under",
"operating",
"conditions",
"and",
"ultimately,",
"extraction",
"of",
"the",
"structure-function",
"relationships",
"used",
"to",
"spearhead",
"the",
"next",
"generation",
"of",
"catalyst",
"development.",
"In",
"this",
"work,",
"we",
"present",
"a",
"complete",
"cycle",
"of",
"synthesis,",
"operando",
"characterization,",
"and",
"redesign",
"of",
"an",
"amorphous",
"cobalt",
"phosphide",
"(",
"CoP",
"<sub><i>x</i></sub>",
")",
"bifunctional",
"catalyst.",
"The",
"research",
"was",
"driven",
"by",
"integrated",
"electrochemical",
"analysis,",
"Raman",
"spectroscopy",
"and",
"gravimetric",
"measurements",
"utilizing",
"a",
"novel",
"quartz",
"crystal",
"microbalance",
"spectroelectrochemical",
"cell",
"to",
"uncover",
"the",
"catalytically",
"active",
"species",
"of",
"amorphous",
"CoP",
"<sub><i>x</i></sub>",
"and",
"subsequently",
"modify",
"the",
"material",
"to",
"enhance",
"the",
"activity",
"of",
"the",
"elucidated",
"catalytic",
"phases.",
"Illustrating",
"the",
"power",
"of",
"our",
"approach,",
"the",
"second",
"generation",
"cobalt-iron",
"phosphide",
"(",
"CoFeP<sub>x</sub>",
")",
"catalyst,",
"developed",
"through",
"an",
"iteration",
"of",
"the",
"operando",
"measurement",
"directed",
"optimization",
"cycle,",
"is",
"superior",
"in",
"both",
"hydrogen",
"and",
"oxygen",
"evolution",
"reactivity",
"over",
"the",
"previous",
"material",
"and",
"is",
"capable",
"of",
"overall",
"water",
"electrolysis",
"at",
"a",
"current",
"density",
"of",
"10",
"mA",
"cm<sup>-2</sup>",
"with",
"1.5",
"V",
"applied",
"bias",
"in",
"1",
"M",
"KOH",
"electrolyte",
"solution.",
"</p>",
"<p>"
],
"tags": [
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
0,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
4,
4,
2,
5,
5,
5,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2,
2
]
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"tokens": "Sequence(feature=Value(dtype='string', id=None), length=-1, id=None)",
"tags": "Sequence(feature=ClassLabel(names=['CATALYST', 'CO-CATALYST', 'O', 'Other', 'PROPERTY_NAME', 'PROPERTY_VALUE'], id=None), length=-1, id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 166 |
| valid | 44 |
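Since `tags` is a `ClassLabel` sequence, the integer ids in a sample can be mapped back to label strings with a plain lookup; a minimal sketch, with the label order copied from the schema above:

```python
# Label order copied from the ClassLabel definition in the schema above.
LABELS = ["CATALYST", "CO-CATALYST", "O", "Other",
          "PROPERTY_NAME", "PROPERTY_VALUE"]

def ids_to_labels(tag_ids):
    """Map integer tag ids (as stored in `tags`) to their label strings."""
    return [LABELS[i] for i in tag_ids]

print(ids_to_labels([0, 4, 5, 2]))
# → ['CATALYST', 'PROPERTY_NAME', 'PROPERTY_VALUE', 'O']
```

When loading with the `datasets` library, the same mapping is also available via the feature's `int2str` method.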
|
Multimodal-Fatima/VQAv2_validation_google_flan_t5_xxl_mode_VQAv2_visclues_detection_ns_100_open_ended | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_bs_16
num_bytes: 13464
num_examples: 100
download_size: 7220
dataset_size: 13464
---
# Dataset Card for "VQAv2_validation_google_flan_t5_xxl_mode_VQAv2_visclues_detection_ns_100_open_ended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jsfs11__TurdusTrixBeagle-DARETIES-7B | ---
pretty_name: Evaluation run of jsfs11/TurdusTrixBeagle-DARETIES-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/TurdusTrixBeagle-DARETIES-7B](https://huggingface.co/jsfs11/TurdusTrixBeagle-DARETIES-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__TurdusTrixBeagle-DARETIES-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T06:52:34.475524](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__TurdusTrixBeagle-DARETIES-7B/blob/main/results_2024-01-24T06-52-34.475524.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655345770405219,\n\
\ \"acc_stderr\": 0.032004831458594445,\n \"acc_norm\": 0.6544154239232413,\n\
\ \"acc_norm_stderr\": 0.03267916416687105,\n \"mc1\": 0.5593635250917993,\n\
\ \"mc1_stderr\": 0.017379697555437442,\n \"mc2\": 0.6881243184665276,\n\
\ \"mc2_stderr\": 0.015188166386714394\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428173,\n\
\ \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.012902554762313962\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7202748456482773,\n\
\ \"acc_stderr\": 0.0044794676194648,\n \"acc_norm\": 0.8860784704242183,\n\
\ \"acc_norm_stderr\": 0.0031706661225176552\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.01275285834653313,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.01275285834653313\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n\
\ \"mc1_stderr\": 0.017379697555437442,\n \"mc2\": 0.6881243184665276,\n\
\ \"mc2_stderr\": 0.015188166386714394\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184135\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624184\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/TurdusTrixBeagle-DARETIES-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|arc:challenge|25_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|gsm8k|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hellaswag|10_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T06-52-34.475524.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T06-52-34.475524.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- '**/details_harness|winogrande|5_2024-01-24T06-52-34.475524.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T06-52-34.475524.parquet'
- config_name: results
data_files:
- split: 2024_01_24T06_52_34.475524
path:
- results_2024-01-24T06-52-34.475524.parquet
- split: latest
path:
- results_2024-01-24T06-52-34.475524.parquet
---
# Dataset Card for Evaluation run of jsfs11/TurdusTrixBeagle-DARETIES-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/TurdusTrixBeagle-DARETIES-7B](https://huggingface.co/jsfs11/TurdusTrixBeagle-DARETIES-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__TurdusTrixBeagle-DARETIES-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-24T06:52:34.475524](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__TurdusTrixBeagle-DARETIES-7B/blob/main/results_2024-01-24T06-52-34.475524.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.655345770405219,
"acc_stderr": 0.032004831458594445,
"acc_norm": 0.6544154239232413,
"acc_norm_stderr": 0.03267916416687105,
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437442,
"mc2": 0.6881243184665276,
"mc2_stderr": 0.015188166386714394
},
"harness|arc:challenge|25": {
"acc": 0.71160409556314,
"acc_stderr": 0.013238394422428173,
"acc_norm": 0.734641638225256,
"acc_norm_stderr": 0.012902554762313962
},
"harness|hellaswag|10": {
"acc": 0.7202748456482773,
"acc_stderr": 0.0044794676194648,
"acc_norm": 0.8860784704242183,
"acc_norm_stderr": 0.0031706661225176552
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.01275285834653313,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.01275285834653313
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437442,
"mc2": 0.6881243184665276,
"mc2_stderr": 0.015188166386714394
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184135
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
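As a quick illustration of how per-task numbers like those above can be combined, here is a minimal sketch that computes a macro-average accuracy over a few of the reported tasks. This is not part of the evaluation harness; the three values are copied from the JSON above, and the averaging scheme shown is only for illustration.

```python
# Illustrative sketch: macro-averaging per-task "acc" values from a results
# dict shaped like the JSON above (values copied from the report).
results = {
    "harness|arc:challenge|25": {"acc": 0.71160409556314},
    "harness|hellaswag|10": {"acc": 0.7202748456482773},
    "harness|winogrande|5": {"acc": 0.8516179952644041},
}

macro_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(macro_acc, 4))  # → 0.7612
```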
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
syeda-raisa/idiom_classification | ---
license: apache-2.0
---
|
xNoper/gaofen_patch5000_binmask | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 991944376.0
num_examples: 5000
download_size: 961406175
dataset_size: 991944376.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigbio/bionlp_st_2011_ge |
---
language:
- en
bigbio_language:
- English
license: cc-by-3.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_3p0
pretty_name: BioNLP 2011 GE
homepage: https://sites.google.com/site/bionlpst/bionlp-shared-task-2011/genia-event-extraction-genia
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- EVENT_EXTRACTION
- NAMED_ENTITY_RECOGNITION
- COREFERENCE_RESOLUTION
---
# Dataset Card for BioNLP 2011 GE
## Dataset Description
- **Homepage:** https://sites.google.com/site/bionlpst/bionlp-shared-task-2011/genia-event-extraction-genia
- **Pubmed:** True
- **Public:** True
- **Tasks:** EE,NER,COREF
The BioNLP-ST GE task has been promoting the development of fine-grained information extraction (IE) from biomedical
documents since 2009. In particular, it has focused on the domain of NFkB as a model domain of biomedical IE.
The GENIA task aims at extracting events occurring upon genes or gene products, which are typed as "Protein"
without differentiating genes from gene products. Other types of physical entities, e.g. cells, cell components,
are not differentiated from each other, and their type is given as "Entity".
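To make the typing scheme concrete, here is a hand-made sketch of the kind of annotation the GE task produces: entity spans typed "Protein" and events anchored on those entities. The field names and the event type used here are assumptions chosen for illustration, not the dataset's exact schema.

```python
# Illustrative example only: a gene-product mention typed "Protein" and an
# event that takes it as an argument. Field names are assumed, not the
# dataset's actual schema.
example = {
    "text": "BMP-6 inhibits growth of mature human B cells.",
    "entities": [
        {"id": "T1", "type": "Protein", "offsets": [[0, 5]], "text": "BMP-6"},
    ],
    "events": [
        {"id": "E1", "type": "Negative_regulation", "trigger": "inhibits",
         "arguments": [{"role": "Cause", "ref_id": "T1"}]},
    ],
}

# Collect all gene/gene-product mentions (typed "Protein" in the GE task).
protein_mentions = [e["text"] for e in example["entities"] if e["type"] == "Protein"]
print(protein_mentions)  # → ['BMP-6']
```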
## Citation Information
```
@inproceedings{10.5555/2107691.2107693,
author = {Kim, Jin-Dong and Wang, Yue and Takagi, Toshihisa and Yonezawa, Akinori},
title = {Overview of Genia Event Task in BioNLP Shared Task 2011},
year = {2011},
isbn = {9781937284091},
publisher = {Association for Computational Linguistics},
address = {USA},
abstract = {The Genia event task, a bio-molecular event extraction task,
is arranged as one of the main tasks of BioNLP Shared Task 2011.
As its second time to be arranged for community-wide focused
efforts, it aimed to measure the advance of the community since 2009,
and to evaluate generalization of the technology to full text papers.
After a 3-month system development period, 15 teams submitted their
performance results on test cases. The results show the community has
made a significant advancement in terms of both performance improvement
and generalization.},
booktitle = {Proceedings of the BioNLP Shared Task 2011 Workshop},
pages = {7–15},
numpages = {9},
location = {Portland, Oregon},
series = {BioNLP Shared Task '11}
}
```
|
joey234/mmlu-astronomy-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 70248
num_examples: 152
download_size: 42587
dataset_size: 70248
---
# Dataset Card for "mmlu-astronomy-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
neural-bridge/rag-full-20000 | ---
dataset_info:
features:
- name: clear_prompt
dtype: string
splits:
- name: train
num_bytes: 43183498.53262665
num_examples: 17433
- name: test
num_bytes: 10797732.467373349
num_examples: 4359
download_size: 32335855
dataset_size: 53981231
task_categories:
- question-answering
language:
- en
size_categories:
- 10K<n<100K
license: apache-2.0
tags:
- retrieval-augmented-generation
---
# **Retrieval-Augmented Generation (RAG) Full 20000**
**Retrieval-Augmented Generation (RAG) Full 20000 is an English dataset designed for RAG-optimized models, built by [Neural Bridge AI](https://www.neuralbridge.ai/), and released under [Apache license 2.0](https://www.apache.org/licenses/LICENSE-2.0.html).**
## **Dataset Description**
#### Dataset Summary
Retrieval-Augmented Generation (RAG) enhances large language models (LLMs) by allowing them to consult an external authoritative knowledge base before generating responses. This approach significantly boosts the models' ability to produce relevant, accurate, and context-specific output by extending their capabilities to specialized domains or an organization's internal data, without the need for retraining. RAG offers a cost-effective method to leverage the vast data processing power of LLMs, equipped with billions of parameters, for tasks such as question-answering, language translation, and sentence completion, ensuring that the output is always up-to-date and applicable to various contexts.
RAG's importance lies in its potential to address the inherent challenges of LLMs, such as unpredictability in responses, reliance on static and potentially outdated training data, and the risk of disseminating incorrect or non-authoritative information. These issues can negatively affect user trust in AI-powered applications, making RAG's ability to guide LLMs toward authoritative sources for information retrieval invaluable.
RAG has multiple benefits, including cost-effective implementation and maintenance, access to current information, improved user trust through accurate information and source attribution, and greater control for developers over the information retrieval process. This approach allows for the dynamic updating of LLMs with the latest research, statistics, or news, directly addressing the challenges of maintaining relevancy and accuracy in rapidly changing knowledge landscapes. Additionally, it empowers organizations to deploy generative AI more confidently across a wider range of applications, enhancing both the user experience and the reliability of AI-driven interactions.
Retrieval-Augmented Generation (RAG) Full 20000 is a single-feature dataset, with each entry containing a "clear_prompt" field, designed to help build RAG-optimized models. It consists of 20000 entries drawn from [Falcon RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), [gsm8k](https://huggingface.co/datasets/gsm8k), and [RAG Hallucination Dataset 1000](https://huggingface.co/datasets/neural-bridge/rag-hallucination-dataset-1000).
```python
from datasets import load_dataset
rag_full = load_dataset("neural-bridge/rag-full-20000")
```
#### Languages
The text in the dataset is in English. The associated BCP-47 code is `en`.
## **Dataset Structure**
#### Data Instances
A typical data point comprises the "clear_prompt" field, which is the concatenation of the "context" (optional), "question", and "answer" fields. The context is obtained from [Falcon RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) and [RAG Hallucination Dataset 1000](https://huggingface.co/datasets/neural-bridge/rag-hallucination-dataset-1000). The question and answer for each data point are either obtained from [gsm8k](https://huggingface.co/datasets/gsm8k) or generated by GPT-4.
An example from the dataset looks like the following:
```
{
clear_prompt: ...
}
```
#### Data Fields
- `clear_prompt`: A single string containing the full prompt text. It includes the optional "context", the "question", and the "answer" fields, introduced by the "##CONTEXT##", "##QUESTION##", and "##ANSWER##" tags respectively.
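A minimal sketch of splitting a `clear_prompt` string back into its parts using those tags. The tag order is taken from the field description above; the exact whitespace layout inside real entries is an assumption, so the parser only relies on the tags appearing in order.

```python
# Sketch: recover context/question/answer from a "clear_prompt" string.
# Assumes only that the ##CONTEXT## (optional), ##QUESTION##, and ##ANSWER##
# tags appear in this order; surrounding whitespace is stripped.
import re

def parse_clear_prompt(clear_prompt: str) -> dict:
    pattern = (
        r"(?:##CONTEXT##\s*(?P<context>.*?)\s*)?"
        r"##QUESTION##\s*(?P<question>.*?)\s*"
        r"##ANSWER##\s*(?P<answer>.*)"
    )
    match = re.fullmatch(pattern, clear_prompt, flags=re.DOTALL)
    if match is None:
        raise ValueError("prompt does not match the expected tag layout")
    return match.groupdict()

sample = ("##CONTEXT## Paris is the capital of France. "
          "##QUESTION## What is the capital of France? "
          "##ANSWER## Paris.")
parsed = parse_clear_prompt(sample)
print(parsed["question"])  # → What is the capital of France?
```

Entries without a context (e.g. gsm8k-derived ones) parse the same way, with `parsed["context"]` left as `None`.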
#### Data Splits
The data is split into a training and a test set. The split sizes are as follows:
| | Train | Test |
| ----- | ------ | ---- |
| RAG Full 20000 | 17433 | 4359 |
## Source Data
The data points in the dataset are from the [Falcon RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), [gsm8k](https://huggingface.co/datasets/gsm8k), and [RAG Hallucination Dataset 1000](https://huggingface.co/datasets/neural-bridge/rag-hallucination-dataset-1000) datasets.
## **Neural Bridge AI RAG Datasets Index**
| Model | Link |
| ----- | ------ |
| RAG Full 20000 | [link](https://huggingface.co/datasets/neural-bridge/rag-full-20000) |
| RAG Dataset 12000 | [link](https://huggingface.co/datasets/neural-bridge/rag-dataset-12000) |
| RAG Dataset 1200 | [link](https://huggingface.co/datasets/neural-bridge/rag-dataset-1200) |
| RAG Hallucination Dataset 1000 | [link](https://huggingface.co/datasets/neural-bridge/rag-hallucination-dataset-1000) |
## **License**
This public extract is made available under [Apache license 2.0](https://www.apache.org/licenses/LICENSE-2.0.html). Users should also abide by the [Falcon RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb), [gsm8k](https://huggingface.co/datasets/gsm8k), and [RAG Hallucination Dataset 1000](https://huggingface.co/datasets/neural-bridge/rag-hallucination-dataset-1000) ToUs. |
Tunyaluck/HateSpeechDataset | ---
license: apache-2.0
---
|
mstz/abalone | ---
language:
- en
tags:
- abalone
- tabular_regression
- regression
- binary_classification
pretty_name: Abalone
size_categories:
- 1K<n<10K
task_categories:
- tabular-regression
- tabular-classification
configs:
- abalone
- binary
license: cc
---
# Abalone
The [Abalone dataset](https://archive-beta.ics.uci.edu/dataset/1/abalone) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Predict the age of the given abalone.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|-----------------------------------------|
| abalone | Regression | Predict the age of the abalone. |
| binary | Binary classification | Does the abalone have more than 9 rings?|
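The binary task's target can be derived directly from the regression target: an abalone is labeled positive when it has more than 9 rings (the threshold comes from the task description above; the 0/1 encoding shown is an assumption for illustration).

```python
# Sketch of how the binary label relates to the number_of_rings feature.
# Threshold per the task table above; 0/1 encoding assumed for illustration.
def has_more_than_nine_rings(number_of_rings: int) -> int:
    return int(number_of_rings > 9)

print([has_more_than_nine_rings(r) for r in [7, 9, 10, 15]])  # → [0, 0, 1, 1]
```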
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/abalone")["train"]
```
# Features
Target feature in bold.
|**Feature** |**Type** |
|-----------------------|---------------|
| sex | `[string]` |
| length | `[float64]` |
| diameter | `[float64]` |
| height | `[float64]` |
| whole_weight | `[float64]` |
| shucked_weight | `[float64]` |
| viscera_weight | `[float64]` |
| shell_weight | `[float64]` |
| **number_of_rings** | `[int8]` | |
AmliArt/face | ---
license: unknown
---
|
RUCAIBox/bbh | ---
license: mit
configs:
- config_name: boolean_expressions
data_files:
- split: dev
path: "dev/boolean_expressions.jsonl"
- split: test
path: "test/boolean_expressions.jsonl"
- config_name: causal_judgement
data_files:
- split: dev
path: "dev/causal_judgement.jsonl"
- split: test
path: "test/causal_judgement.jsonl"
- config_name: date_understanding
data_files:
- split: dev
path: "dev/date_understanding.jsonl"
- split: test
path: "test/date_understanding.jsonl"
- config_name: disambiguation_qa
data_files:
- split: dev
path: "dev/disambiguation_qa.jsonl"
- split: test
path: "test/disambiguation_qa.jsonl"
- config_name: dyck_languages
data_files:
- split: dev
path: "dev/dyck_languages.jsonl"
- split: test
path: "test/dyck_languages.jsonl"
- config_name: formal_fallacies
data_files:
- split: dev
path: "dev/formal_fallacies.jsonl"
- split: test
path: "test/formal_fallacies.jsonl"
- config_name: geometric_shapes
data_files:
- split: dev
path: "dev/geometric_shapes.jsonl"
- split: test
path: "test/geometric_shapes.jsonl"
- config_name: hyperbaton
data_files:
- split: dev
path: "dev/hyperbaton.jsonl"
- split: test
path: "test/hyperbaton.jsonl"
- config_name: logical_deduction_five_objects
data_files:
- split: dev
path: "dev/logical_deduction_five_objects.jsonl"
- split: test
path: "test/logical_deduction_five_objects.jsonl"
- config_name: logical_deduction_seven_objects
data_files:
- split: dev
path: "dev/logical_deduction_seven_objects.jsonl"
- split: test
path: "test/logical_deduction_seven_objects.jsonl"
- config_name: logical_deduction_three_objects
data_files:
- split: dev
path: "dev/logical_deduction_three_objects.jsonl"
- split: test
path: "test/logical_deduction_three_objects.jsonl"
- config_name: movie_recommendation
data_files:
- split: dev
path: "dev/movie_recommendation.jsonl"
- split: test
path: "test/movie_recommendation.jsonl"
- config_name: multistep_arithmetic_two
data_files:
- split: dev
path: "dev/multistep_arithmetic_two.jsonl"
- split: test
path: "test/multistep_arithmetic_two.jsonl"
- config_name: navigate
data_files:
- split: dev
path: "dev/navigate.jsonl"
- split: test
path: "test/navigate.jsonl"
- config_name: object_counting
data_files:
- split: dev
path: "dev/object_counting.jsonl"
- split: test
path: "test/object_counting.jsonl"
- config_name: penguins_in_a_table
data_files:
- split: dev
path: "dev/penguins_in_a_table.jsonl"
- split: test
path: "test/penguins_in_a_table.jsonl"
- config_name: reasoning_about_colored_objects
data_files:
- split: dev
path: "dev/reasoning_about_colored_objects.jsonl"
- split: test
path: "test/reasoning_about_colored_objects.jsonl"
- config_name: ruin_names
data_files:
- split: dev
path: "dev/ruin_names.jsonl"
- split: test
path: "test/ruin_names.jsonl"
- config_name: salient_translation_error_detection
data_files:
- split: dev
path: "dev/salient_translation_error_detection.jsonl"
- split: test
path: "test/salient_translation_error_detection.jsonl"
- config_name: snarks
data_files:
- split: dev
path: "dev/snarks.jsonl"
- split: test
path: "test/snarks.jsonl"
- config_name: sports_understanding
data_files:
- split: dev
path: "dev/sports_understanding.jsonl"
- split: test
path: "test/sports_understanding.jsonl"
- config_name: temporal_sequences
data_files:
- split: dev
path: "dev/temporal_sequences.jsonl"
- split: test
path: "test/temporal_sequences.jsonl"
- config_name: tracking_shuffled_objects_five_objects
data_files:
- split: dev
path: "dev/tracking_shuffled_objects_five_objects.jsonl"
- split: test
path: "test/tracking_shuffled_objects_five_objects.jsonl"
- config_name: tracking_shuffled_objects_seven_objects
data_files:
- split: dev
path: "dev/tracking_shuffled_objects_seven_objects.jsonl"
- split: test
path: "test/tracking_shuffled_objects_seven_objects.jsonl"
- config_name: tracking_shuffled_objects_three_objects
data_files:
- split: dev
path: "dev/tracking_shuffled_objects_three_objects.jsonl"
- split: test
path: "test/tracking_shuffled_objects_three_objects.jsonl"
- config_name: web_of_lies
data_files:
- split: dev
path: "dev/web_of_lies.jsonl"
- split: test
path: "test/web_of_lies.jsonl"
- config_name: word_sorting
data_files:
- split: dev
path: "dev/word_sorting.jsonl"
- split: test
path: "test/word_sorting.jsonl"
---
|
income/cqadupstack-gis-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# CQADupstack (GIS): 20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the CQADupstack (GIS) subset of the BEIR benchmark.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in CQADupstack (GIS) in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
The original dataset card for the BEIR benchmark follows below.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
For example, using the `beir` package (a sketch of the library's quickstart; the dataset name is illustrative):
```python
from beir import util
from beir.datasets.data_loader import GenericDataLoader

dataset = "scifact"  # any BEIR-Name from the table below
url = f"https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/{dataset}.zip"
data_path = util.download_and_unzip(url, "datasets")
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
```
### Supported Tasks and Leaderboards
The benchmark supports a leaderboard that evaluates retrieval models on each dataset using standard IR metrics such as nDCG@10.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
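As a sketch, the three files described above can be produced with the standard library (file names and contents are illustrative):

```python
import csv
import json

corpus = [{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}]
queries = [{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}]
qrels = [("q1", "doc1", 1)]

# corpus.jsonl and queries.jsonl: one JSON object per line
with open("corpus.jsonl", "w", encoding="utf-8") as f:
    for doc in corpus:
        f.write(json.dumps(doc) + "\n")
with open("queries.jsonl", "w", encoding="utf-8") as f:
    for q in queries:
        f.write(json.dumps(q) + "\n")

# qrels.tsv: tab-separated, with the header as the first row
with open("qrels.tsv", "w", encoding="utf-8", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerow(["query-id", "corpus-id", "score"])
    writer.writerows(qrels)
```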
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
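Given the `qrels` structure above, relevance judgments score a retrieved ranking. A minimal sketch computing recall@k (BEIR itself reports metrics such as nDCG@10 through its evaluation utilities; names here are illustrative):

```python
def recall_at_k(qrels: dict, results: dict, k: int) -> float:
    """Fraction of relevant documents retrieved in the top-k, averaged over queries."""
    scores = []
    for qid, relevant in qrels.items():
        top_k = results.get(qid, [])[:k]
        hits = sum(1 for doc_id in top_k if relevant.get(doc_id, 0) > 0)
        scores.append(hits / len(relevant))
    return sum(scores) / len(scores)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
results = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}  # ranked doc ids per query
print(recall_at_k(qrels, results, k=1))  # -> 0.5
```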
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `_id`: a `string` feature representing the query id
- `_id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
Irza/dodol_irza | ---
license: cc-by-sa-3.0
---
|
text-machine-lab/vocab_filtered_dataset_22B | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 95741202256
num_examples: 142498558
download_size: 19794480275
dataset_size: 95741202256
---
# Dataset Card for "vocab_filtered_dataset_22B"
## Dataset Description
- **Paper: https://arxiv.org/abs/2404.02204**
- **Point of Contact: sherinbojappa_muckatira@student.uml.edu**
### Dataset Summary
This data is the simplified, vocabulary-filtered pretraining data released with the paper "Emergent Abilities in Reduced-Scale Generative Language Models". The vocabulary is derived from the AO-Childes speech corpus (https://github.com/UIUCLearningLanguageLab/AOCHILDES).
We filter the train split of the SlimPajama dataset (https://huggingface.co/datasets/cerebras/SlimPajama-627B) against the AO-Childes vocabulary, retaining spans that contain only integers, symbols, and words belonging to the AO-Childes vocabulary; around 1.5% out-of-vocabulary words are also allowed. Contiguous spans of 32 tokens are selected, and each span is delimited by start-of-span `<s>` and end-of-span `</s>` symbols.
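A minimal sketch of this kind of vocabulary filtering (the function name, toy vocabulary, and tokenization are illustrative; the released data was produced with the full AO-Childes vocabulary and 32-token spans):

```python
def keep_span(tokens: list[str], vocab: set[str], max_oov_rate: float = 0.015) -> bool:
    """Keep a span if its tokens are integers, symbols, or in-vocabulary words,
    allowing a small out-of-vocabulary fraction (~1.5% as described above)."""
    def in_vocab(tok: str) -> bool:
        return tok.lower() in vocab or tok.isdigit() or not tok.isalnum()
    oov = sum(1 for tok in tokens if not in_vocab(tok))
    return oov / len(tokens) <= max_oov_rate

vocab = {"the", "cat", "sat", "on", "mat"}
print(keep_span("the cat sat on the mat".split(), vocab))       # -> True
print(keep_span("the ontology sat on the mat".split(), vocab))  # -> False
```

Kept spans would then be wrapped with the `<s>` and `</s>` delimiters before being written out.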
### Citation Information
If this dataset is useful to you please cite our work.
```
@misc{muckatira2024emergent,
title={Emergent Abilities in Reduced-Scale Generative Language Models},
author={Sherin Muckatira and Vijeta Deshpande and Vladislav Lialin and Anna Rumshisky},
year={2024},
eprint={2404.02204},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
kuanhuggingface/promptTTS_ER_small | ---
dataset_info:
features:
- name: file
dtype: string
- name: id
dtype: string
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': cheerful
'1': neural
'2': sad
'3': shouting
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 6792824.0
num_examples: 40
- name: validation
num_bytes: 6792824.0
num_examples: 40
download_size: 10185218
dataset_size: 13585648.0
---
# Dataset Card for "promptTTS_ER_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indiehackers/winogrande_debiased-telugu-romanized | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: option1
dtype: string
- name: option2
dtype: string
- name: answer
dtype: string
- name: qas_id
dtype: int64
splits:
- name: train
num_bytes: 1485031
num_examples: 9248
- name: test
num_bytes: 280567
num_examples: 1767
- name: valid
num_bytes: 202851
num_examples: 1267
download_size: 1023456
dataset_size: 1968449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
bigscience-data/roots_indic-te_pib | ---
language: te
license: cc-by-sa-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-te_pib
# pib
- Dataset uid: `pib`
### Description
Sentence-aligned parallel corpus across 11 Indian languages, crawled and extracted from the Press Information Bureau website.
### Homepage
- https://huggingface.co/datasets/pib
- http://preon.iiit.ac.in/~jerin/bhasha/
### Licensing
Creative Commons Attribution-ShareAlike 4.0 International
### Speaker Locations
### Sizes
- 0.0609 % of total
- 0.6301 % of indic-hi
- 3.2610 % of indic-ur
- 0.6029 % of indic-ta
- 3.0834 % of indic-or
- 1.9757 % of indic-mr
- 0.2181 % of indic-bn
- 1.8901 % of indic-pa
- 1.5457 % of indic-gu
- 0.4695 % of indic-ml
- 0.5767 % of indic-te
### BigScience processing steps
#### Filters applied to: indic-hi
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: indic-mr
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
yzhuang/metatree_fri_c2_1000_5 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 43440
num_examples: 724
- name: validation
num_bytes: 16560
num_examples: 276
download_size: 56761
dataset_size: 60000
---
# Dataset Card for "metatree_fri_c2_1000_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vovadevico/fashion-gender-500 | ---
license: unlicense
---
|
agent-eto/eto-sft-trajectory | ---
configs:
- config_name: default
data_files:
- split: webshop
path: data/webshop_*
- split: scienceworld
path: data/sciworld_*
- split: alfworld
path: data/alfworld_*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: webshop
num_examples: 1823
- name: scienceworld
num_examples: 1482
- name: alfworld
num_examples: 3118
language:
- en
pretty_name: ETO-SFT-Trajectory
license: apache-2.0
size_categories:
- 1K<n<10K
---
# Expert Trajectories for ETO
<p align="center">
<img src=https://raw.githubusercontent.com/Yifan-Song793/ETO/main/assets/main.png width=700/>
</p>
[**🌐 Homepage**](https://huggingface.co/spaces/agent-eto/Agent-ETO) | [**🐍 GitHub**](https://github.com/Yifan-Song793/ETO) | [**📖 arXiv**](https://arxiv.org/abs/2403.02502)
Expert trajectories for [Trial and Error: Exploration-Based Trajectory Optimization for LLM Agents](https://arxiv.org/abs/2403.02502)
Authors: [Yifan Song](https://github.com/Yifan-Song793), [Da Yin](https://wadeyin9712.github.io/), [Xiang Yue](https://xiangyue9607.github.io/), [Jie Huang](https://jeffhj.github.io/), [Sujian Li](http://123.56.88.210/), [Bill Yuchen Lin](https://yuchenlin.xyz/).
We introduce **ETO** (Exploration-based Trajectory Optimization), an agent learning framework inspired by the "trial and error" process of human learning.
ETO allows an LLM agent to iteratively collect failure trajectories and update its policy by learning from contrastive failure-success trajectory pairs.
**ETO** has the following features:
- 🕹️ **Learning by Trial and Error**
- 🎲 **Learning from Failure Trajectories.** Contrary to previous approaches that exclusively train on successful expert trajectories, ETO allows agents to learn from their exploration failures.
- 🎭 **Contrastive Trajectory Optimization.** ETO applies DPO loss to perform policy learning from failure-success trajectory pairs.
- 🌏 **Iterative Policy Learning.** ETO can be expanded to multiple rounds for further policy enhancement.
- 🎖️ **Superior Performance**
- ⚔️ **Effectiveness on Three Datasets.** ETO significantly outperforms strong baselines, such as RFT, PPO, on [WebShop](https://webshop-pnlp.github.io/), [ScienceWorld](https://sciworld.apps.allenai.org/), and [ALFWorld](https://alfworld.github.io/).
- 🦾 **Generalization on Unseen Scenarios.** ETO demonstrates an impressive performance improvement of 22% over SFT on the challenging out-of-distribution test set in ScienceWorld.
- ⌛ **Task-Solving Efficiency.** ETO achieves higher rewards within fewer action steps on ScienceWorld.
- 💡 **Potential in Extreme Scenarios.** ETO shows better performance in self-play scenarios where expert trajectories are not available.
## Expert Trajectories
This dataset contains expert trajectories for three agent environments:
- **WebShop**: We preprocess the official [human demonstrations](https://github.com/princeton-nlp/WebShop/issues/21) provided by the authors of WebShop. We also employ GPT-4 to explore the environment and select trajectories with rewards greater than 0.7.
- **ScienceWorld**: The environment provides a heuristic algorithm to generate golden trajectories.
- **ALFWorld**: The authors provide a few human-annotated trajectories for imitation learning.
Since the original trajectories do not contain CoT information for each action step, we utilize GPT-4 to generate the corresponding rationales.
## 🛠️ Setup & Evaluation
Please see our [GitHub Repo](https://github.com/Yifan-Song793/ETO).
## 📑 The Data Format for Training the Agent
```json
[
{
"id": "example_0",
"conversations": [
{
"from": "human",
"value": "Who are you?"
},
{
"from": "gpt",
"value": "I am Vicuna, a language model trained by researchers from Large Model Systems Organization (LMSYS)."
},
{
"from": "human",
"value": "Have a nice day!"
},
{
"from": "gpt",
"value": "You too!"
}
]
}
]
```
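Records in this format can be unpacked into (prompt, response) pairs in a few lines of Python. This is an illustrative sketch, not the project's training code; it assumes strictly alternating human/gpt turns, as in the example above.

```python
def to_pairs(record):
    """Split a conversation record into (human prompt, gpt reply) pairs,
    assuming turns strictly alternate human -> gpt."""
    turns = record["conversations"]
    pairs = []
    for prompt, reply in zip(turns[::2], turns[1::2]):
        assert prompt["from"] == "human" and reply["from"] == "gpt"
        pairs.append((prompt["value"], reply["value"]))
    return pairs

# The example record from above, abbreviated.
example = {
    "id": "example_0",
    "conversations": [
        {"from": "human", "value": "Who are you?"},
        {"from": "gpt", "value": "I am Vicuna, a language model."},
        {"from": "human", "value": "Have a nice day!"},
        {"from": "gpt", "value": "You too!"},
    ],
}
```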
## 📖 Citation
If you find this dataset helpful, please cite our paper:
```
@article{song2024trial,
author={Yifan Song and Da Yin and Xiang Yue and Jie Huang and Sujian Li and Bill Yuchen Lin},
title={Trial and Error: Exploration-Based Trajectory Optimization for LLM Agents},
year={2024},
eprint={2403.02502},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
tonyshining/vlsp20_1proceed | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 11815806943.0
num_examples: 10000
download_size: 4769846911
dataset_size: 11815806943.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xblaster/energetic-passport | ---
license: openrail
---
|
chikino/DEADPOOL1 | ---
license: openrail
---
|
open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato | ---
pretty_name: Evaluation run of Walmart-the-bag/Solar-10.7B-Cato
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Walmart-the-bag/Solar-10.7B-Cato](https://huggingface.co/Walmart-the-bag/Solar-10.7B-Cato)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T02:07:16.124496](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato/blob/main/results_2023-12-30T02-07-16.124496.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6601184986157275,\n\
\ \"acc_stderr\": 0.03173344410424321,\n \"acc_norm\": 0.6615926738267002,\n\
\ \"acc_norm_stderr\": 0.03237146731014547,\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.017471992091697544,\n \"mc2\": 0.6168232864590555,\n\
\ \"mc2_stderr\": 0.015630771495356736\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.01355267154362349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6845249950209121,\n\
\ \"acc_stderr\": 0.0046375504780073636,\n \"acc_norm\": 0.8615813582951604,\n\
\ \"acc_norm_stderr\": 0.0034463307489637123\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137282,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137282\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.02573364199183898,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.02573364199183898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\
acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3229050279329609,\n\
\ \"acc_stderr\": 0.015638440380241484,\n \"acc_norm\": 0.3229050279329609,\n\
\ \"acc_norm_stderr\": 0.015638440380241484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48239895697522817,\n\
\ \"acc_stderr\": 0.012762321298823646,\n \"acc_norm\": 0.48239895697522817,\n\
\ \"acc_norm_stderr\": 0.012762321298823646\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.017471992091697544,\n \"mc2\": 0.6168232864590555,\n\
\ \"mc2_stderr\": 0.015630771495356736\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8121546961325967,\n \"acc_stderr\": 0.010977481103435088\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6459438968915845,\n \
\ \"acc_stderr\": 0.013172728385222576\n }\n}\n```"
repo_url: https://huggingface.co/Walmart-the-bag/Solar-10.7B-Cato
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T02-07-16.124496.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- '**/details_harness|winogrande|5_2023-12-30T02-07-16.124496.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T02-07-16.124496.parquet'
- config_name: results
data_files:
- split: 2023_12_30T02_07_16.124496
path:
- results_2023-12-30T02-07-16.124496.parquet
- split: latest
path:
- results_2023-12-30T02-07-16.124496.parquet
---
# Dataset Card for Evaluation run of Walmart-the-bag/Solar-10.7B-Cato
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/Solar-10.7B-Cato](https://huggingface.co/Walmart-the-bag/Solar-10.7B-Cato) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato",
"harness_winogrande_5",
	split="latest")
```
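
Once fetched, the aggregated results are plain JSON (see the latest results further down) and can be summarized with standard Python. A minimal sketch, using illustrative values copied from that results file:

```python
import json

# Minimal sketch: extract per-task accuracy from a results payload shaped
# like the aggregated results JSON below (values copied from the sample).
results_json = """
{
  "all": {"acc": 0.6601184986157275, "acc_norm": 0.6615926738267002},
  "harness|arc:challenge|25": {"acc": 0.6390784982935154, "acc_norm": 0.6868600682593856},
  "harness|hellaswag|10": {"acc": 0.6845249950209121, "acc_norm": 0.8615813582951604}
}
"""

results = json.loads(results_json)
# Drop the "all" aggregate and keep per-task accuracies.
per_task_acc = {task: m["acc"] for task, m in results.items() if task != "all"}
best_task = max(per_task_acc, key=per_task_acc.get)
print(best_task)  # harness|hellaswag|10
```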
## Latest results
These are the [latest results from run 2023-12-30T02:07:16.124496](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Solar-10.7B-Cato/blob/main/results_2023-12-30T02-07-16.124496.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6601184986157275,
"acc_stderr": 0.03173344410424321,
"acc_norm": 0.6615926738267002,
"acc_norm_stderr": 0.03237146731014547,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697544,
"mc2": 0.6168232864590555,
"mc2_stderr": 0.015630771495356736
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.01355267154362349
},
"harness|hellaswag|10": {
"acc": 0.6845249950209121,
"acc_stderr": 0.0046375504780073636,
"acc_norm": 0.8615813582951604,
"acc_norm_stderr": 0.0034463307489637123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.02573364199183898,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.02573364199183898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3229050279329609,
"acc_stderr": 0.015638440380241484,
"acc_norm": 0.3229050279329609,
"acc_norm_stderr": 0.015638440380241484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48239895697522817,
"acc_stderr": 0.012762321298823646,
"acc_norm": 0.48239895697522817,
"acc_norm_stderr": 0.012762321298823646
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697544,
"mc2": 0.6168232864590555,
"mc2_stderr": 0.015630771495356736
},
"harness|winogrande|5": {
"acc": 0.8121546961325967,
"acc_stderr": 0.010977481103435088
},
"harness|gsm8k|5": {
"acc": 0.6459438968915845,
"acc_stderr": 0.013172728385222576
}
}
```
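As a minimal sketch (not part of the original evaluation tooling), the per-task `acc` values in a results dictionary like the one above can be aggregated in plain Python. The keys and scores below are excerpted from the JSON in this card; the averaging logic is an illustration, not the leaderboard's own aggregation code:

```python
# Small excerpt mirroring the structure of the results JSON above;
# keys follow the lm-evaluation-harness naming shown in this card.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.8446601941747572},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8675213675213675},
    "harness|gsm8k|5": {"acc": 0.6459438968915845},
}

# Average the "acc" metric across the hendrycksTest (MMLU) subtasks only.
mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))  # → 0.8561
```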
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zolak/twitter_dataset_81_1713143631 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 224200
num_examples: 600
download_size: 112761
dataset_size: 224200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DynamicSuperb/NoiseSNRLevelPrediction_VCTK_MUSAN-Music | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 25743087.466964453
num_examples: 200
download_size: 25530114
dataset_size: 25743087.466964453
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "NoiseSNRLevelPredictionmusic_VCTKMusan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus | ---
pretty_name: Evaluation run of lgaalves/llama-2-7b-hf_open-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/llama-2-7b-hf_open-platypus](https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T18:18:23.592235](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus/blob/main/results_2023-10-16T18-18-23.592235.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893118953,\n \"f1\": 0.05986052852348985,\n\
\ \"f1_stderr\": 0.0013631018920376853,\n \"acc\": 0.40511844075987347,\n\
\ \"acc_stderr\": 0.00954910251873735\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118953,\n\
\ \"f1\": 0.05986052852348985,\n \"f1_stderr\": 0.0013631018920376853\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06595905989385899,\n \
\ \"acc_stderr\": 0.006836951192034225\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440474\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T18_18_23.592235
path:
- '**/details_harness|drop|3_2023-10-16T18-18-23.592235.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T18-18-23.592235.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T18_18_23.592235
path:
- '**/details_harness|gsm8k|5_2023-10-16T18-18-23.592235.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T18-18-23.592235.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T18_18_23.592235
path:
- '**/details_harness|winogrande|5_2023-10-16T18-18-23.592235.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T18-18-23.592235.parquet'
- config_name: results
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- results_2023-08-31T14:20:30.830996.parquet
- split: 2023_10_16T18_18_23.592235
path:
- results_2023-10-16T18-18-23.592235.parquet
- split: latest
path:
- results_2023-10-16T18-18-23.592235.parquet
---
# Dataset Card for Evaluation run of lgaalves/llama-2-7b-hf_open-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/llama-2-7b-hf_open-platypus](https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-16T18:18:23.592235](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus/blob/main/results_2023-10-16T18-18-23.592235.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118953,
"f1": 0.05986052852348985,
"f1_stderr": 0.0013631018920376853,
"acc": 0.40511844075987347,
"acc_stderr": 0.00954910251873735
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118953,
"f1": 0.05986052852348985,
"f1_stderr": 0.0013631018920376853
},
"harness|gsm8k|5": {
"acc": 0.06595905989385899,
"acc_stderr": 0.006836951192034225
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440474
}
}
```
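For quick programmatic access, the aggregated metrics above can be pulled out of such a results dict. A minimal sketch using the values shown (the dict layout simply mirrors the JSON above):

```python
# Sketch: extract per-task accuracy from a results dict shaped like the JSON above.
results = {
    "harness|drop|3": {"em": 0.0012583892617449664, "f1": 0.05986052852348985},
    "harness|gsm8k|5": {"acc": 0.06595905989385899},
    "harness|winogrande|5": {"acc": 0.744277821625888},
}

# Keep only the tasks that report an accuracy metric.
accs = {task: m["acc"] for task, m in results.items() if "acc" in m}
mean_acc = sum(accs.values()) / len(accs)
```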
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Lechcher/Java | ---
license: apache-2.0
---
|
Ryan-sjtu/celebahq-caption | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2756863400.0
num_examples: 30000
download_size: 2762815442
dataset_size: 2756863400.0
---
|
goldstream/bolehpisan | ---
license: other
license_name: pelicula
license_link: LICENSE
---
|
aruca/meetingbank-gpt3.5 | ---
dataset_info:
features:
- name: summary
dtype: string
- name: uid
dtype: string
- name: id
dtype: int64
- name: transcript
dtype: string
splits:
- name: train
num_bytes: 19805900
num_examples: 3000
- name: validation
num_bytes: 2408688
num_examples: 400
- name: test
num_bytes: 2494155
num_examples: 400
download_size: 13487953
dataset_size: 24708743
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
hardikch05/NSText2SQL-custom-100000 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 181775689
num_examples: 100000
download_size: 31232036
dataset_size: 181775689
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sm2923/cs482-assignment1 | ---
pretty_name: CS482-Assignment1
dataset_info:
features:
- name: pipeline-1__longitude
dtype: float64
- name: pipeline-1__latitude
dtype: float64
- name: pipeline-1__housing_median_age
dtype: float64
- name: pipeline-1__total_rooms
dtype: float64
- name: pipeline-1__total_bedrooms
dtype: float64
- name: pipeline-1__population
dtype: float64
- name: pipeline-1__households
dtype: float64
- name: pipeline-1__median_income
dtype: float64
- name: pipeline-2__ocean_proximity_<1H OCEAN
dtype: float64
- name: pipeline-2__ocean_proximity_INLAND
dtype: float64
- name: pipeline-2__ocean_proximity_ISLAND
dtype: float64
- name: pipeline-2__ocean_proximity_NEAR BAY
dtype: float64
- name: pipeline-2__ocean_proximity_NEAR OCEAN
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1849344
num_examples: 16512
download_size: 966442
dataset_size: 1849344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VaibhavGp69/Aarogya_MedQuad-MedicalQnADataset | ---
dataset_info:
features:
- name: qtype
dtype: string
- name: Aarogya_prompt
dtype: string
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 23877170
num_examples: 16407
download_size: 9148381
dataset_size: 23877170
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
haroldim/voz-femi-mult | ---
license: openrail
---
|
CyberHarem/ceobe_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ceobe/ケオベ/刻俄柏 (Arknights)
This is the dataset of ceobe/ケオベ/刻俄柏 (Arknights), containing 500 images and their tags.
The core tags of this character are `animal_ears, long_hair, dog_ears, brown_hair, dog_girl, red_eyes, breasts, very_long_hair, tail, hair_between_eyes, dog_tail, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 834.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ceobe_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 411.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ceobe_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1227 | 881.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ceobe_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 705.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ceobe_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1227 | 1.34 GiB | [Download](https://huggingface.co/datasets/CyberHarem/ceobe_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ceobe_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blush, jacket, simple_background, solo, upper_body, closed_mouth, looking_at_viewer, smile, white_background, long_sleeves |
| 1 | 5 |  |  |  |  |  | 1girl, :d, jacket, open_mouth, solo, upper_body, blush, looking_at_viewer, simple_background, skin_fang, white_background, long_sleeves, medium_breasts |
| 2 | 10 |  |  |  |  |  | 1girl, ears_through_headwear, hat, looking_at_viewer, official_alternate_costume, solo, twin_braids, white_headwear, :d, open_mouth, simple_background, white_background, black_jacket, blush, open_jacket, chain, long_sleeves, upper_body, skin_fang, black_scarf, black_skirt, brown_shirt, cowboy_shot |
| 3 | 5 |  |  |  |  |  | 1girl, black_jacket, black_skirt, chain, ears_through_headwear, hat, long_sleeves, official_alternate_costume, open_jacket, open_mouth, red_gloves, shirt, solo, twin_braids, white_headwear, :d, cowboy_shot, looking_at_viewer, oripathy_lesion_(arknights), simple_background, white_background, fangs, food, hands_up, scarf |
| 4 | 5 |  |  |  |  |  | 1girl, :d, jacket, long_sleeves, open_mouth, solo, thigh_boots, thighhighs, belt, dress, looking_at_viewer, simple_background, skin_fang, white_background, oripathy_lesion_(arknights), brown_eyes, cowboy_shot, large_breasts, weapon_on_back |
| 5 | 5 |  |  |  |  |  | 1girl, holding_weapon, jacket, long_sleeves, looking_at_viewer, solo, thigh_boots, thighhighs, belt, cowboy_shot, smile, closed_mouth, multiple_weapons, oripathy_lesion_(arknights), staff, infection_monitor_(arknights), large_breasts, open_mouth |
| 6 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, official_alternate_costume, orange_jacket, solo, bare_shoulders, open_mouth, simple_background, white_background, :d, collar, off_shoulder, chain, fangs, padlock, shoes, weapon, white_footwear |
| 7 | 8 |  |  |  |  |  | 1girl, hairclip, long_sleeves, onesie, pajamas, solo, hood, hugging_object, pillow_hug, white_background, looking_at_viewer, open_mouth, simple_background, barefoot, blush, dakimakura_(object), official_alternate_costume, ponytail, :d, full_body, cameo, character_doll, holding_pillow, lying, skin_fang |
| 8 | 6 |  |  |  |  |  | 2girls, blush, jacket, :d, open_mouth, solo_focus, upper_body, long_sleeves, holding, rabbit_ears |
| 9 | 9 |  |  |  |  |  | 1girl, blush, 1boy, hetero, nipples, open_mouth, penis, solo_focus, large_breasts, pussy, mosaic_censoring, spread_legs, bar_censor, navel, sex, smile, completely_nude, heart, looking_at_viewer, on_back, sweat, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | jacket | simple_background | solo | upper_body | closed_mouth | looking_at_viewer | smile | white_background | long_sleeves | :d | open_mouth | skin_fang | medium_breasts | ears_through_headwear | hat | official_alternate_costume | twin_braids | white_headwear | black_jacket | open_jacket | chain | black_scarf | black_skirt | brown_shirt | cowboy_shot | red_gloves | shirt | oripathy_lesion_(arknights) | fangs | food | hands_up | scarf | thigh_boots | thighhighs | belt | dress | brown_eyes | large_breasts | weapon_on_back | holding_weapon | multiple_weapons | staff | infection_monitor_(arknights) | orange_jacket | bare_shoulders | collar | off_shoulder | padlock | shoes | weapon | white_footwear | hairclip | onesie | pajamas | hood | hugging_object | pillow_hug | barefoot | dakimakura_(object) | ponytail | full_body | cameo | character_doll | holding_pillow | lying | 2girls | solo_focus | holding | rabbit_ears | 1boy | hetero | nipples | penis | pussy | mosaic_censoring | spread_legs | bar_censor | navel | sex | completely_nude | heart | on_back | sweat | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------|:--------------------|:-------|:-------------|:---------------|:--------------------|:--------|:-------------------|:---------------|:-----|:-------------|:------------|:-----------------|:------------------------|:------|:-----------------------------|:--------------|:-----------------|:---------------|:--------------|:--------|:--------------|:--------------|:--------------|:--------------|:-------------|:--------|:------------------------------|:--------|:-------|:-----------|:--------|:--------------|:-------------|:-------|:--------|:-------------|:----------------|:-----------------|:-----------------|:-------------------|:--------|:--------------------------------|:----------------|:-----------------|:---------|:---------------|:----------|:--------|:---------|:-----------------|:-----------|:---------|:----------|:-------|:-----------------|:-------------|:-----------|:----------------------|:-----------|:------------|:--------|:-----------------|:-----------------|:--------|:---------|:-------------|:----------|:--------------|:-------|:---------|:----------|:--------|:--------|:-------------------|:--------------|:-------------|:--------|:------|:------------------|:--------|:----------|:--------|:----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | | X | X | X | | X | | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | | | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | X | | | X | | X | X | X | X | X | | | | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | X | | X | X | X | | X | | X | | | | | | | | | | | | | | X | | | X | | | | | X | X | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | X | X | | | X | | X | | X | X | | | | | X | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | X | | X | X | | | X | | X | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | | X | X | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | X | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
alfredplpl/wikipedia-simple-ja-100k | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 26728437
num_examples: 106876
download_size: 0
dataset_size: 26728437
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-3.0
task_categories:
- summarization
language:
- ja
---
# Dataset Card for "wikipedia-simple-ja-100k"
# Original Dataset
- hpprc/wikipedia-20240101
# Procedure
- Extract the first line of the article for each title from the dataset.
- Generate the answer by summarizing that line using an LLM:
  - Input a RAG-like prompt to CALM 2 7B Chat.
  - Format the response.
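Assembled in code, the template shown below amounts to roughly the following (a sketch; `title` and `text` are the dataset's fields, and the exact whitespace handling is an assumption):

```python
def build_prompt(title: str, text: str) -> str:
    # RAG-like prompt for CALM 2 7B Chat, using the template from this card.
    return f"USER: {title}とはなんですか?次の文章を参考に一言でまとめてください。{text}\nASSISTANT: "

prompt = build_prompt("量子力学", "量子力学は物理学の一分野である。")
```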
# RAG-like Prompt
```python
f"""USER: {title}とはなんですか?次の文章を参考に一言でまとめてください。{text}
ASSISTANT: """
``` |
sujantkumarkv/black_law_dictionary | ---
license: cc-by-nc-sa-4.0
---
|
yoyoyoyoinf/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yzhuang/metatree_fri_c4_1000_50 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 296100
num_examples: 705
- name: validation
num_bytes: 123900
num_examples: 295
download_size: 504225
dataset_size: 420000
---
# Dataset Card for "metatree_fri_c4_1000_50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
castorini/mr-tydi | ---
language:
- ar
- bn
- en
- fi
- id
- fi
- ja
- ko
- ru
- sw
- te
- th
multilinguality:
- multilingual
task_categories:
- text-retrieval
license: apache-2.0
---
# Dataset Summary
Mr. TyDi is a multi-lingual benchmark dataset built on TyDi, covering eleven typologically diverse languages. It is designed for monolingual retrieval, specifically to evaluate ranking with learned dense representations.
This dataset stores the queries, judgements, and example training data of Mr. TyDi. To access the corpus, please refer to [castorini/mr-tydi-corpus](https://huggingface.co/datasets/castorini/mr-tydi-corpus).
# Dataset Structure
The only configuration here is `language`.
For each language, there are three splits: `train`, `dev`, and `test`.
The negative examples in the training set are sampled from the top-30 BM25 runfiles for each language.
In addition, we combine the **training** data of all languages under the `combined` configuration.
An example from the `train` set looks as follows:
```
{
'query_id': '1',
'query': 'When was quantum field theory developed?',
'positive_passages': [
{
'docid': '25267#12',
'title': 'Quantum field theory',
'text': 'Quantum field theory naturally began with the study of electromagnetic interactions, as the electromagnetic field was the only known classical field as of the 1920s.'
},
...
]
'negative_passages': [
{
'docid': '346489#8',
'title': 'Local quantum field theory',
'text': 'More recently, the approach has been further implemented to include an algebraic version of quantum field ...'
},
...
],
}
```
An example from the `dev` and `test` sets looks as follows. We only provide the docids of positive passages here to save space,
and no candidate passages are provided at this point.
Note that to perform retrieval, this dataset needs to be used together with [castorini/mr-tydi-corpus](https://huggingface.co/datasets/castorini/mr-tydi-corpus)
```
{
'query_id': '0',
'query': 'Is Creole a pidgin of French?',
'positive_passages': [
{
'docid': '3716905#1',
'title': '',
'text': ''
},
...
]
}
```
# Load Dataset
An example to load the dataset:
```python
from datasets import load_dataset

language = 'english'
# to load all train, dev and test sets
dataset = load_dataset('castorini/mr-tydi', language)

# or to load a specific split:
set_name = 'train'
dataset = load_dataset('castorini/mr-tydi', language, split=set_name)
```
Note that the 'combined' option has only the 'train' set.
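For training dense retrievers, a common preprocessing step is to flatten each training example into (query, positive, negative) text triples. A minimal sketch against the schema shown above (the all-pairs pairing strategy here is an illustration, not the official recipe):

```python
def to_triples(example):
    """Pair every positive passage with every sampled negative for one query."""
    query = example["query"]
    return [
        (query, pos["text"], neg["text"])
        for pos in example["positive_passages"]
        for neg in example["negative_passages"]
    ]

# Toy example mirroring the train-set schema shown above.
example = {
    "query": "When was quantum field theory developed?",
    "positive_passages": [{"docid": "25267#12", "title": "Quantum field theory", "text": "pos text"}],
    "negative_passages": [{"docid": "346489#8", "title": "Local quantum field theory", "text": "neg text"}],
}
triples = to_triples(example)
```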
# Citation Information
```
@article{mrtydi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
```
|
Atnafu/Parallel_dataset_for_Ethiopian_languages | ---
license: afl-3.0
---
This dataset contains parallel corpora for Ethiopian languages. |
mnoukhov/summarize_from_feedback_tldr3_generated_20k_vllm_pythia1b_dpo_temp0.7 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 36449155
num_examples: 19999
download_size: 22341908
dataset_size: 36449155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibivibiv/alpaca_lamini1 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56235783
num_examples: 129279
download_size: 36318072
dataset_size: 56235783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
talalcringe/taxi_fares | ---
dataset_info:
features:
- name: key
dtype: string
- name: fare_amount
dtype: float64
- name: pickup_datetime
dtype: string
- name: pickup_longitude
dtype: float64
- name: pickup_latitude
dtype: float64
- name: dropoff_longitude
dtype: float64
- name: dropoff_latitude
dtype: float64
- name: passenger_count
dtype: int64
splits:
- name: test
num_bytes: 853844
num_examples: 9914
- name: train
num_bytes: 5912642536
num_examples: 55423856
download_size: 3775451814
dataset_size: 5913496380
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
llm-aes/summeval-annotated-full | ---
dataset_info:
features:
- name: task_id
dtype: string
- name: worker_id
dtype: string
- name: human_label
dtype: int64
- name: llm_label
dtype: int64
- name: generator_1
dtype: string
- name: generator_2
dtype: string
- name: premise
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 105581934
num_examples: 48000
download_size: 7225191
dataset_size: 105581934
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HDBrinkmann/HDBTEST4PLAN02 | ---
license: apache-2.0
language:
- de
tags:
- finance
size_categories:
- 1K<n<10K
--- |
TrainingDataPro/spine-x-ray | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-segmentation
- image-to-image
language:
- en
tags:
- medical
- code
---
# Spine X-rays
The dataset consists of a collection of spine X-ray images in **.jpg and .dcm** formats. The images are organized into folders based on different medical conditions related to the spine. Each folder contains images depicting specific spinal deformities.
### Types of diseases and conditions in the dataset:
*Scoliosis, Osteochondrosis, Osteoporosis, Spondylolisthesis, Vertebral Compression Fractures (VCFs), Disability, Other and Healthy*

The dataset provides an opportunity for researchers and medical professionals to *analyze and develop algorithms for automated diagnosis, treatment planning, and prognosis estimation of* **various spinal conditions**.
It allows the development and evaluation of computer-based algorithms, machine learning models, and deep learning techniques for **automated detection, diagnosis, and classification** of these conditions.
# Get the Dataset
## This is just an example of the data
Leave a request on [https://trainingdata.pro/data-market](https://trainingdata.pro/data-market/spine-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=spine-x-ray) to discuss your requirements, learn about the price and buy the dataset
# Content
### The folder "files" includes 8 folders:
- each corresponding to the name of a disease/condition and including X-rays of people with this disease/condition (**scoliosis, osteochondrosis, VCFs, etc.**)
- each including X-rays in 2 different formats: **.jpg and .dcm**.
### The .csv file includes the following information for each media file:
- **dcm**: link to access the .dcm file,
- **jpg**: link to access the .jpg file,
- **type**: name of the disease or condition on the x-ray
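As a minimal sketch, the .csv annotations can be grouped by condition with Python's standard `csv` module. The `dcm`/`jpg`/`type` column names follow the list above; the sample rows and file paths are invented for illustration:

```python
import csv
import io

# Hypothetical CSV content mirroring the three columns described above;
# in the real dataset, dcm and jpg are links to the media files.
sample_csv = (
    "dcm,jpg,type\n"
    "files/scoliosis/001.dcm,files/scoliosis/001.jpg,Scoliosis\n"
    "files/osteoporosis/002.dcm,files/osteoporosis/002.jpg,Osteoporosis\n"
)

def group_by_condition(csv_text):
    """Map each disease/condition to the list of its .jpg links."""
    groups = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        groups.setdefault(row["type"], []).append(row["jpg"])
    return groups

print(group_by_condition(sample_csv))
```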
# Medical data might be collected in accordance with your requirements.
## [TrainingData](https://trainingdata.pro/data-market/spine-mri?utm_source=huggingface&utm_medium=cpc&utm_campaign=spine-x-ray) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/trainingdata-pro**
*keywords: spine dataset, spine X-rays dataset, scoliosis detection dataset, scoliosis segmentation dataset, scoliosis image dataset, medical imaging, radiology dataset, spine deformity dataset, orthopedic abnormalities, scoliotic curve dataset, degenerative spinal conditions, diagnostic imaging of the spine, osteoporosis dataset, osteochondrosis dataset, vertebral compression fracture detection, vertebral segmentation dataset*
|
Atipico1/nq-test-adv-replace-v3 | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
- name: hasanswer
dtype: bool
- name: is_adversarial
dtype: bool
splits:
- name: test
num_bytes: 57386242
num_examples: 3610
download_size: 32792526
dataset_size: 57386242
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
nateraw/ade20k-tiny | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- found
language:
- en
license:
- bsd-3-clause
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- extended|ade20k
task_categories:
- image-segmentation
task_ids:
- semantic-segmentation
pretty_name: ADE 20K Tiny
---
# Dataset Card for ADE 20K Tiny
This is a tiny subset of the ADE 20K dataset, which you can find [here](https://huggingface.co/datasets/scene_parse_150). |
skrishna/coin_flip_7_transformed | ---
dataset_info:
features:
- name: targets
dtype: string
- name: targets_vec
sequence: int64
- name: inputs
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 1097256
num_examples: 2000
- name: train
num_bytes: 1097824
num_examples: 2000
download_size: 573549
dataset_size: 2195080
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
rishiraj/no_robots | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 28805395
num_examples: 9500
- name: test
num_bytes: 1545168
num_examples: 500
download_size: 18891461
dataset_size: 30350563
---
# Dataset Card for "no_robots"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thaisum | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- th
license:
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- summarization
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: ThaiSum
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
- name: summary
dtype: string
- name: type
dtype: string
- name: tags
dtype: string
- name: url
dtype: string
config_name: thaisum
splits:
- name: train
num_bytes: 2945472406
num_examples: 358868
- name: validation
num_bytes: 118437310
num_examples: 11000
- name: test
num_bytes: 119496704
num_examples: 11000
download_size: 647582078
dataset_size: 3183406420
---
# Dataset Card for ThaiSum
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/nakhunchumpolsathien/ThaiSum
- **Repository:** https://github.com/nakhunchumpolsathien/ThaiSum
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** https://github.com/nakhunchumpolsathien
### Dataset Summary
ThaiSum is a large-scale corpus for Thai text summarization obtained from several online news websites namely Thairath, ThaiPBS, Prachathai, and The Standard. This dataset consists of over 350,000 article and summary pairs written by journalists.
### Supported Tasks and Leaderboards
summarization, language modeling
### Languages
Thai
## Dataset Structure
### Data Instances
```
{'body': 'กีเก ซานเชซ ฟลอเรส\xa0 กุนซือเลือดกระทิงของทีมวัตฟอร์ด\xa0 เมินประเด็นจุดโทษปัญหาในเกมพรีเมียร์ลีก อังกฤษ นัดที่แตนอาละวาดเปิดบ้านพ่าย คริสตัล พาเลซ 0-1ชี้ทีมของเขาเล่นไม่ดีพอเอง,สำนักข่าวต่างประเทศรายงานวันที่ 27 ก.ย. ว่า กีเก ซานเชซ ฟลอเรส\xa0 ผู้จัดการทีมชาวสเปน ของ แตนอาละวาด วัตฟอร์ด\xa0 ยอมรับทีมของเขาเล่นได้ไม่ดีพอเอง ในเกมพรีเมียร์ลีก อังกฤษ นัดเปิดบ้านพ่าย อินทรีผงาด คริสตัล พาเลซ 0-1 เมื่อคืนวันอาทิตย์ที่ผ่านมา,เกมนี้จุดเปลี่ยนมาอยู่ที่การได้จุดโทษในช่วงครึ่งหลังของ คริสตัล พาเลซ ซึ่งไม่ค่อยชัดเจนเท่าไหร่ว่า อัลลัน นียอม นั้นไปทำฟาล์วใส่ วิลฟรีด ซาฮา ในเขตโทษหรือไม่ แต่ผู้ตัดสินก็ชี้เป็นจุดโทษ ซึ่ง โยอัน กาบาย สังหารไม่พลาด และเป็นประตูชัยช่วยให้ คริสตัล พาเลซ เอาชนะ วัตฟอร์ด ไป 1-0 และเป็นการพ่ายแพ้ในบ้านนัดแรกของวัตฟอร์ดในฤดูกาลนี้อีกด้วย,ฟลอเรส กล่าวว่า มันเป็นเรื่องยากในการหยุดเกมรุกของคริสตัล พาเลซ ซึ่งมันอึดอัดจริงๆสำหรับเรา เราเล่นกันได้ไม่ดีนักในตอนที่ได้ครองบอล เราต้องเล่นทางริมเส้นให้มากกว่านี้ เราไม่สามารถหยุดเกมสวนกลับของพวกเขาได้ และแนวรับของเราก็ยืนไม่เป็นระเบียบสักเท่าไหร่ในช่วงครึ่งแรก ส่วนเรื่องจุดโทษการตัดสินใจขั้นสุดท้ายมันอยู่ที่ผู้ตัดสิน ซึ่งมันเป็นการตัดสินใจที่สำคัญ ผมเองก็ไม่รู้ว่าเขาตัดสินถูกหรือเปล่า บางทีมันอาจเป็นจุดที่ตัดสินเกมนี้เลย แต่เราไม่ได้แพ้เกมนี้เพราะจุดโทษ เราแพ้ในวันนี้เพราะเราเล่นไม่ดีและคริสตัล พาเลซ เล่นดีกว่าเรา เราไม่ได้มีฟอร์มการเล่นที่ดีในเกมนี้เลย', 'summary': 'กีเก ซานเชซ ฟลอเรส กุนซือเลือดกระทิงของทีมวัตฟอร์ด เมินประเด็นจุดโทษปัญหาในเกมพรีเมียร์ลีก อังกฤษ นัดที่แตนอาละวาดเปิดบ้านพ่าย คริสตัล พาเลซ 0-1ชี้ทีมของเขาเล่นไม่ดีพอเอง', 'tags': 'พรีเมียร์ลีก,วัตฟอร์ด,คริสตัล พาเลซ,กีเก ซานเชซ ฟลอเรส,ข่าวกีฬา,ข่าว,ไทยรัฐออนไลน์', 'title': 'ฟลอเรส รับ วัตฟอร์ดห่วยเองเกมพ่ายพาเลซคาบ้าน', 'type': '', 'url': 'https://www.thairath.co.th/content/528322'}
```
### Data Fields
- `title`: title of article
- `body`: body of article
- `summary`: summary of article
- `type`: type of article, if any
- `tags`: tags of article, separated by `,`
- `url`: URL of article
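Since `tags` is a single comma-separated string, a minimal sketch of splitting it into a list (using the `tags` value from the instance above):

```python
# tags value taken from the example instance above
tags = 'พรีเมียร์ลีก,วัตฟอร์ด,คริสตัล พาเลซ,กีเก ซานเชซ ฟลอเรส,ข่าวกีฬา,ข่าว,ไทยรัฐออนไลน์'

def parse_tags(tag_string):
    # Split on ',' and drop surrounding whitespace / empty entries.
    return [t.strip() for t in tag_string.split(',') if t.strip()]

print(parse_tags(tags))
```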
### Data Splits
train/valid/test: 358868 / 11000 / 11000
## Dataset Creation
### Curation Rationale
Sequence-to-sequence (Seq2Seq) models have shown great achievement in text summarization. However, Seq2Seq models often require large-scale training data to achieve effective results. Although many impressive advancements have been made in the text summarization field, most summarization studies focus on resource-rich languages. The progress of Thai text summarization is still far behind. The dearth of large-scale datasets keeps Thai text summarization in its infancy. As far as our knowledge goes, there is no large-scale dataset for Thai text summarization available anywhere. Thus, we present ThaiSum, a large-scale corpus for Thai text summarization obtained from several online news websites, namely Thairath, ThaiPBS, Prachathai, and The Standard.
### Source Data
#### Initial Data Collection and Normalization
We used a Python library named Scrapy to crawl articles from several news websites, namely Thairath, Prachatai, ThaiPBS, and The Standard. We first collected news URLs provided in their sitemaps. During web-crawling, we used HTML markup and metadata available in HTML pages to identify article text, summary, headline, tags and label. Collected articles were published online from 2014 to August 2020. <br> <br>
We further performed a data-cleansing process to minimize noisy data. We filtered out articles whose article text or summary was missing. Articles containing article text with fewer than 150 words or a summary with fewer than 15 words were removed. We also discarded articles that contain at least one of the following tags: ‘ดวง’ (horoscope), ‘นิยาย’ (novel), ‘อินสตราแกรมดารา’ (celebrity Instagram), ‘คลิปสุดฮา’ (funny video) and ‘สรุปข่าว’ (highlight news). Some summaries were completely irrelevant to their original article texts. To eliminate those irrelevant summaries, we calculated an abstractedness score between each summary and its article text. The abstractedness score is written formally as: <br>
<center><a href="https://www.codecogs.com/eqnedit.php?latex=\begin{equation}&space;\frac{|S-A|}{r}&space;\times&space;100&space;\end{equation}" target="_blank"><img src="https://latex.codecogs.com/gif.latex?\begin{equation}&space;\frac{|S-A|}{r}&space;\times&space;100&space;\end{equation}" title="\begin{equation} \frac{|S-A|}{r} \times 100 \end{equation}" /></a></center><br>
<br>where 𝑆 denotes the set of article tokens, 𝐴 denotes the set of summary tokens, and 𝑟 denotes the total number of summary tokens. We omitted articles with an abstractedness score at 1-grams higher than 60%.
<br><br>
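A minimal Python sketch of this score, under the assumption that 𝑆 is the summary token set and 𝐴 the article token set (the reading that keeps the score within 0–100), with plain whitespace tokenization standing in for the PyThaiNLP newmm tokenizer used by the authors:

```python
def abstractedness(article_tokens, summary_tokens):
    # |S - A| / r * 100: the number of unique summary tokens absent from
    # the article, divided by r, the total number of summary tokens,
    # scaled to a percentage.
    S = set(summary_tokens)   # summary token set (assumption, see above)
    A = set(article_tokens)   # article token set (assumption, see above)
    r = len(summary_tokens)
    return len(S - A) / r * 100

# Whitespace tokenization for illustration only; Thai text would need
# a proper tokenizer such as PyThaiNLP's newmm engine.
article = "the cabinet approved the budget on monday".split()
summary = "cabinet passes budget".split()
print(abstractedness(article, summary))  # one of three summary tokens is novel
```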
It is important to point out that we used [PyThaiNLP](https://github.com/PyThaiNLP/pythainlp), version 2.2.4 (tokenizing engine = newmm), to process Thai texts in this study. It is challenging to tokenize running Thai text into words or sentences because there are no clear word/sentence delimiters in the Thai language. Therefore, using different tokenization engines may result in different word/sentence segmentations.
After the data-cleansing process, the ThaiSum dataset contains over 358,000 articles. The size of this dataset is comparable to that of a well-known English document summarization dataset, the CNN/Daily Mail dataset. Moreover, we analyse the characteristics of this dataset by measuring the abstractedness level, compression rate, and content diversity. For more details, see [thaisum_exploration.ipynb](https://github.com/nakhunchumpolsathien/ThaiSum/blob/master/thaisum_exploration.ipynb).
#### Dataset Statistics
The ThaiSum dataset consists of 358,868 articles. Average lengths of article texts and summaries are approximately 530 and 37 words, respectively. As mentioned earlier, we also collected the headlines, tags and labels provided in each article. Tags are similar to keywords of the article. An article normally contains several tags but only a few labels. Tags can be names of places or persons that the article is about, while labels indicate the news category (politics, entertainment, etc.). Ultimately, ThaiSum contains 538,059 unique tags and 59 unique labels. Note that not every article contains tags or labels.
|Dataset Size| 358,868 | articles |
|:---|---:|---:|
|Avg. Article Length| 529.5 | words|
|Avg. Summary Length | 37.3 | words|
|Avg. Headline Length | 12.6 | words|
|Unique Vocabulary Size | 407,355 | words|
|Occurring > 10 times | 81,761 | words|
|Unique News Tag Size | 538,059 | tags|
|Unique News Label Size | 59 | labels|
#### Who are the source language producers?
Journalists of respective articles
### Annotations
#### Annotation process
`summary`, `type` and `tags` are created by journalists who wrote the articles and/or their publishers.
#### Who are the annotators?
`summary`, `type` and `tags` are created by journalists who wrote the articles and/or their publishers.
### Personal and Sensitive Information
All data are public news articles. No personal and sensitive information is expected to be included.
## Considerations for Using the Data
### Social Impact of Dataset
- News summarization in Thai
- Language modeling for Thai news
### Discussion of Biases
- [ThaiPBS](https://www.thaipbs.or.th/home) [receives funding from Thai government](https://www.bangkokbiznews.com/blog/detail/648740).
- [Thairath](https://www.thairath.co.th/) is known as [the most popular newspaper in Thailand](https://mgronline.com/onlinesection/detail/9620000058532); no clear political leaning.
- [The Standard](https://thestandard.co/) is a left-leaning online magazine.
- [Prachathai](https://prachatai.com/) is a left-leaning, human-right-focused news site.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[@nakhunchumpolsathien](https://github.com/nakhunchumpolsathien/)
[@caramelWaffle](https://github.com/caramelWaffle)
### Licensing Information
MIT License
### Citation Information
```
@mastersthesis{chumpolsathien_2020,
title={Using Knowledge Distillation from Keyword Extraction to Improve the Informativeness of Neural Cross-lingual Summarization},
author={Chumpolsathien, Nakhun},
year={2020},
school={Beijing Institute of Technology}
}
```
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
wangxinze/Verilog_data | ---
license: apache-2.0
---
|
xwjzds/pretrain_sts_long | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 9557417
num_examples: 38151
download_size: 6115013
dataset_size: 9557417
---
# Dataset Card for Sentence Paraphrase Collections

## Dataset Description

- **Repository:**
- **Paper:** [DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM](https://arxiv.org/abs/2310.15296)
- **Leaderboard:**
- **Point of Contact:** Weijie Xu

### Dataset Summary

Sentence_Paraphrase is a combination of sentence-paraphrase tasks from various sources, such as paraphrasing using ChatGPT, Paraphrase Adversaries from Word Scrambling (PAWS), and the STS benchmark. We filtered out pairs that are detected as non-English, are too short, or do not have a high similarity score.

| Category | Count |
|:---|---:|
| Paraphrase | 223241 |

## Dataset Structure

### Data Instances

An example of the data is as follows: {'input': 'U.S. prosecutors have arrested more than 130 individuals and have seized more than $17 million in a continuing crackdown on Internet fraud and abuse.', 'output': 'More than 130 people have been arrested and $17 million worth of property seized in an Internet fraud sweep announced Friday by three U.S. government agencies.'}

### Data Fields

The data fields are as follows:

- `input` and `output`: paraphrases of a sentence or paragraph.
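As a rough, hypothetical illustration of the filtering described above (the card does not publish the actual language-detection or similarity-threshold code), a length/identity filter might look like:

```python
def keep_pair(inp, out, min_chars=30):
    # Drop pairs that are too short or trivially identical. The actual
    # pipeline also applied English-language detection and a semantic
    # similarity threshold, which are not reproduced in this sketch.
    if len(inp) < min_chars or len(out) < min_chars:
        return False
    return inp.strip() != out.strip()

# The example pair from the Data Instances section above
example = {
    'input': 'U.S. prosecutors have arrested more than 130 individuals and have '
             'seized more than $17 million in a continuing crackdown on Internet '
             'fraud and abuse.',
    'output': 'More than 130 people have been arrested and $17 million worth of '
              'property seized in an Internet fraud sweep announced Friday by '
              'three U.S. government agencies.',
}
print(keep_pair(example['input'], example['output']))  # True
```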
## Dataset Creation

### Curation Rationale

[More Information Needed]

### Source Data

#### Initial Data Collection and Normalization

[More Information Needed]

#### Who are the source language producers?

[More Information Needed]

### Annotations

#### Annotation process

[More Information Needed]

#### Who are the annotators?

[More Information Needed]

### Personal and Sensitive Information

[More Information Needed]

## Considerations for Using the Data

### Social Impact of Dataset

[More Information Needed]

### Discussion of Biases

[More Information Needed]

### Other Known Limitations

[More Information Needed]

## Additional Information

### Dataset Curators

[More Information Needed]

### Licensing Information

The dataset is available under the Creative Commons NonCommercial license (CC BY-NC 4.0).

### Citation Information

@misc{xu2023detime, title={DeTiME: Diffusion-Enhanced Topic Modeling using Encoder-decoder based LLM}, author={Weijie Xu and Wenxiang Hu and Fanyou Wu and Srinivasan Sengamedu}, year={2023}, eprint={2310.15296}, archivePrefix={arXiv}, primaryClass={cs.CL} } |
DynamicSuperbPrivate/SpokenTermDetection_Tedlium2Train | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: text
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 15786905536.68
num_examples: 92967
- name: validation
num_bytes: 117079048.0
num_examples: 507
download_size: 15262598420
dataset_size: 15903984584.68
---
# Dataset Card for "SpokenTermDetection_Tedlium2Train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rubend18/ChatGPT-Jailbreak-Prompts | ---
task_categories:
- question-answering
- text-generation
- fill-mask
- zero-shot-classification
- table-question-answering
language:
- en
- aa
tags:
- ChatGPT
- JailbreakPrompts
- LanguageModeling
- ArtificialIntelligence
- TextGeneration
- Dataset
- OpenAI
- Jailbreak
- Prompts
size_categories:
- n<1K
pretty_name: ChatGPT Jailbreak Prompts
---
# Dataset Card for Dataset Name
## Name
ChatGPT Jailbreak Prompts
## Dataset Description
- **Author:** Rubén Darío Jaramillo
- **Email:** rubend18@hotmail.com
- **WhatsApp:** +593 93 979 6676
### Dataset Summary
ChatGPT Jailbreak Prompts is a complete collection of jailbreak-related prompts for ChatGPT. This dataset is intended to provide a valuable resource for understanding and generating text in the context of jailbreaking ChatGPT.
### Languages
[English] |
MohammedNasri/cv11_ar_denoisy | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 5817720018
num_examples: 10440
download_size: 2954294028
dataset_size: 5817720018
---
# Dataset Card for "cv11_ar_denoisy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/akatsuki_kirika_senkizesshousymphogear | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Akatsuki Kirika
This is the dataset of Akatsuki Kirika, containing 300 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 300 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 717 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 300 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 300 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 300 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 300 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 300 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 717 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 717 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 717 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
mmuttharasan/llmjptk2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 81960.0
num_examples: 10
- name: test
num_bytes: 16392.0
num_examples: 2
download_size: 42049
dataset_size: 98352.0
---
# Dataset Card for "llmjptk2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/LanguageIdentification_VoxForge | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 38570550.0
num_examples: 200
download_size: 37425191
dataset_size: 38570550.0
---
# Dataset Card for "LanguageIdentification_VoxForge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.46 | ---
pretty_name: Evaluation run of liminerity/Blur-7b-slerp-v1.46
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Blur-7b-slerp-v1.46](https://huggingface.co/liminerity/Blur-7b-slerp-v1.46)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.46\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T18:46:59.781015](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.46/blob/main/results_2024-02-29T18-46-59.781015.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501476948387063,\n\
\ \"acc_stderr\": 0.03209960685561106,\n \"acc_norm\": 0.6493927389532388,\n\
\ \"acc_norm_stderr\": 0.0327717827872929,\n \"mc1\": 0.6181150550795593,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7661412005458291,\n\
\ \"mc2_stderr\": 0.013951105204747587\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n\
\ \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7164907388966342,\n\
\ \"acc_stderr\": 0.004497803024345146,\n \"acc_norm\": 0.8906592312288388,\n\
\ \"acc_norm_stderr\": 0.0031142850772280318\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n\
\ \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n\
\ \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045702,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045702\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6181150550795593,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.7661412005458291,\n\
\ \"mc2_stderr\": 0.013951105204747587\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433537\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6967399545109931,\n \
\ \"acc_stderr\": 0.012661502663418697\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7b-slerp-v1.46
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-46-59.781015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-46-59.781015.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- '**/details_harness|winogrande|5_2024-02-29T18-46-59.781015.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T18-46-59.781015.parquet'
- config_name: results
data_files:
- split: 2024_02_29T18_46_59.781015
path:
- results_2024-02-29T18-46-59.781015.parquet
- split: latest
path:
- results_2024-02-29T18-46-59.781015.parquet
---
# Dataset Card for Evaluation run of liminerity/Blur-7b-slerp-v1.46
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-slerp-v1.46](https://huggingface.co/liminerity/Blur-7b-slerp-v1.46) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
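Because per-run splits are named after the run timestamp, picking the most recent one programmatically just means parsing those names. A minimal sketch (the helper name and the timestamp format are assumptions inferred from the split names listed in this card's config):

```python
from datetime import datetime

# Hypothetical helper: given a configuration's split names (timestamped
# runs plus the "latest" alias), return the most recent timestamped split.
# Split names follow the pattern seen above, e.g. 2024_02_29T18_46_59.781015.
def newest_split(split_names):
    stamps = [s for s in split_names if s != "latest"]
    return max(stamps, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(newest_split(["2024_02_29T18_46_59.781015", "latest"]))
```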
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.46",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-29T18:46:59.781015](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-slerp-v1.46/blob/main/results_2024-02-29T18-46-59.781015.json) (note that there might be results for other tasks in the repository if successive evaluations didn't cover the same tasks; you can find each one in its task-specific configuration, with a "latest" split for each evaluation):
```json
{
"all": {
"acc": 0.6501476948387063,
"acc_stderr": 0.03209960685561106,
"acc_norm": 0.6493927389532388,
"acc_norm_stderr": 0.0327717827872929,
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7661412005458291,
"mc2_stderr": 0.013951105204747587
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520767,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.012928933196496363
},
"harness|hellaswag|10": {
"acc": 0.7164907388966342,
"acc_stderr": 0.004497803024345146,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.0031142850772280318
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045702,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045702
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6181150550795593,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.7661412005458291,
"mc2_stderr": 0.013951105204747587
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433537
},
"harness|gsm8k|5": {
"acc": 0.6967399545109931,
"acc_stderr": 0.012661502663418697
}
}
```
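Every per-task entry in this payload shares the same shape, so pulling out the headline numbers is a small dictionary pass. A sketch using a truncated stand-in for the full payload above (the real file contains one entry per task):

```python
# Truncated stand-in for the results payload shown above; the aggregate
# metrics live under "all" and every other key is a per-task entry.
results = {
    "all": {"acc": 0.6501476948387063, "acc_norm": 0.6493927389532388},
    "harness|arc:challenge|25": {"acc": 0.7098976109215017, "acc_norm": 0.7329351535836177},
    "harness|hellaswag|10": {"acc": 0.7164907388966342, "acc_norm": 0.8906592312288388},
}

# Collect per-task accuracy, skipping the aggregate entry.
per_task_acc = {k: v["acc"] for k, v in results.items() if k != "all"}

print(f"aggregate acc: {results['all']['acc']:.4f}")
for task, acc in sorted(per_task_acc.items()):
    print(f"  {task}: {acc:.4f}")
```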
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Weni/LLM_Base_2.0.3_SFT_negative_reduction | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: answear
dtype: string
- name: context
dtype: string
- name: correct_ans
dtype: int64
- name: language
dtype: string
splits:
- name: pt
num_bytes: 17286508
num_examples: 9104
- name: en
num_bytes: 16708573
num_examples: 9234
- name: es
num_bytes: 16687922
num_examples: 8692
download_size: 17636383
dataset_size: 50683003
configs:
- config_name: default
data_files:
- split: pt
path: data/pt-*
- split: en
path: data/en-*
- split: es
path: data/es-*
---
|
jlh/uci-mushrooms | ---
dataset_info:
features:
- name: poisonous
dtype:
class_label:
names:
'0': e
'1': p
- name: cap-shape
dtype: string
- name: cap-surface
dtype: string
- name: cap-color
dtype: string
- name: bruises
dtype: string
- name: odor
dtype: string
- name: gill-attachment
dtype: string
- name: gill-spacing
dtype: string
- name: gill-size
dtype: string
- name: gill-color
dtype: string
- name: stalk-shape
dtype: string
- name: stalk-root
dtype: string
- name: stalk-surface-above-ring
dtype: string
- name: stalk-surface-below-ring
dtype: string
- name: stalk-color-above-ring
dtype: string
- name: stalk-color-below-ring
dtype: string
- name: veil-type
dtype: string
- name: veil-color
dtype: string
- name: ring-number
dtype: string
- name: ring-type
dtype: string
- name: spore-print-color
dtype: string
- name: population
dtype: string
- name: habitat
dtype: string
splits:
- name: train
num_bytes: 958632
num_examples: 8124
download_size: 90673
dataset_size: 958632
---
# Dataset Card for "uci-mushrooms"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyenth1312/zalo_ai_1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 86682675.13
num_examples: 1362
download_size: 84062474
dataset_size: 86682675.13
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlanYky/subjective-no-instruction-with-symbol | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 735404
num_examples: 500
download_size: 333332
dataset_size: 735404
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_oh-yeontaek__llama-2-7B-LoRA-assemble | ---
pretty_name: Evaluation run of oh-yeontaek/llama-2-7B-LoRA-assemble
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [oh-yeontaek/llama-2-7B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-7B-LoRA-assemble)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_oh-yeontaek__llama-2-7B-LoRA-assemble\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T23:43:13.966405](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-7B-LoRA-assemble/blob/main/results_2023-10-24T23-43-13.966405.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31596057046979864,\n\
\ \"em_stderr\": 0.004760983364669265,\n \"f1\": 0.39136640100671266,\n\
\ \"f1_stderr\": 0.004644890166719777,\n \"acc\": 0.3674033149171271,\n\
\ \"acc_stderr\": 0.006203274733096429\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31596057046979864,\n \"em_stderr\": 0.004760983364669265,\n\
\ \"f1\": 0.39136640100671266,\n \"f1_stderr\": 0.004644890166719777\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n\
\ \"acc_stderr\": 0.012406549466192858\n }\n}\n```"
repo_url: https://huggingface.co/oh-yeontaek/llama-2-7B-LoRA-assemble
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T23_43_13.966405
path:
- '**/details_harness|drop|3_2023-10-24T23-43-13.966405.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T23-43-13.966405.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T23_43_13.966405
path:
- '**/details_harness|gsm8k|5_2023-10-24T23-43-13.966405.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T23-43-13.966405.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-57-16.083940.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-57-16.083940.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T17-57-16.083940.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T23_43_13.966405
path:
- '**/details_harness|winogrande|5_2023-10-24T23-43-13.966405.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T23-43-13.966405.parquet'
- config_name: results
data_files:
- split: 2023_09_13T17_57_16.083940
path:
- results_2023-09-13T17-57-16.083940.parquet
- split: 2023_10_24T23_43_13.966405
path:
- results_2023-10-24T23-43-13.966405.parquet
- split: latest
path:
- results_2023-10-24T23-43-13.966405.parquet
---
# Dataset Card for Evaluation run of oh-yeontaek/llama-2-7B-LoRA-assemble
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/oh-yeontaek/llama-2-7B-LoRA-assemble
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [oh-yeontaek/llama-2-7B-LoRA-assemble](https://huggingface.co/oh-yeontaek/llama-2-7B-LoRA-assemble) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_oh-yeontaek__llama-2-7B-LoRA-assemble",
"harness_winogrande_5",
split="latest")
```
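The timestamped split names listed in the YAML header above are derived mechanically from each run's timestamp. A minimal sketch of that convention (the `timestamp_to_split` helper is our own illustration, not part of any library):

```python
def timestamp_to_split(ts: str) -> str:
    # Dashes in the run timestamp become underscores in the split name,
    # e.g. "2023-10-24T23-43-13.966405" -> "2023_10_24T23_43_13.966405".
    return ts.replace("-", "_")

print(timestamp_to_split("2023-10-24T23-43-13.966405"))
```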
## Latest results
These are the [latest results from run 2023-10-24T23:43:13.966405](https://huggingface.co/datasets/open-llm-leaderboard/details_oh-yeontaek__llama-2-7B-LoRA-assemble/blob/main/results_2023-10-24T23-43-13.966405.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31596057046979864,
"em_stderr": 0.004760983364669265,
"f1": 0.39136640100671266,
"f1_stderr": 0.004644890166719777,
"acc": 0.3674033149171271,
"acc_stderr": 0.006203274733096429
},
"harness|drop|3": {
"em": 0.31596057046979864,
"em_stderr": 0.004760983364669265,
"f1": 0.39136640100671266,
"f1_stderr": 0.004644890166719777
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.012406549466192858
}
}
```
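Once parsed, the payload above is a plain nested dictionary keyed by harness task name. A small sketch of pulling out the headline metrics (the dict literal simply copies the excerpt above rather than downloading anything):

```python
# Values copied from the "all" and winogrande blocks shown above.
results = {
    "all": {
        "em": 0.31596057046979864,
        "f1": 0.39136640100671266,
        "acc": 0.3674033149171271,
    },
    "harness|winogrande|5": {"acc": 0.7348066298342542},
}

# Round the aggregate metrics for display.
headline = {metric: round(value, 4) for metric, value in results["all"].items()}
print(headline)
```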
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JovialValley/syllable_totaldataset_4 | ---
dataset_info:
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype: string
- name: emotion
dtype: string
- name: emotion_str
dtype: string
splits:
- name: train
num_bytes: 163180696.0
num_examples: 390
- name: test
num_bytes: 41085347.0
num_examples: 97
download_size: 137671411
dataset_size: 204266043.0
---
# Dataset Card for "syllable_totaldataset_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1 | ---
pretty_name: Evaluation run of SC99/Mistral-7B-privatemix-ia1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC99/Mistral-7B-privatemix-ia1](https://huggingface.co/SC99/Mistral-7B-privatemix-ia1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T23:00:45.925269](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1/blob/main/results_2024-01-28T23-00-45.925269.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6514069537640662,\n\
\ \"acc_stderr\": 0.03224835259879914,\n \"acc_norm\": 0.6505037607853619,\n\
\ \"acc_norm_stderr\": 0.03293066455457689,\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7178902486503331,\n\
\ \"mc2_stderr\": 0.014856727473105872\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.01320319608853737,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7088229436367257,\n\
\ \"acc_stderr\": 0.004533764686211992,\n \"acc_norm\": 0.8858793069109739,\n\
\ \"acc_norm_stderr\": 0.003173079807440182\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500097,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500097\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n\
\ \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n\
\ \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"\
acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824873,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7178902486503331,\n\
\ \"mc2_stderr\": 0.014856727473105872\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.0100125988056273\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \
\ \"acc_stderr\": 0.012670420440198681\n }\n}\n```"
repo_url: https://huggingface.co/SC99/Mistral-7B-privatemix-ia1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|arc:challenge|25_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|gsm8k|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hellaswag|10_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T23-00-45.925269.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- '**/details_harness|winogrande|5_2024-01-28T23-00-45.925269.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T23-00-45.925269.parquet'
- config_name: results
data_files:
- split: 2024_01_28T23_00_45.925269
path:
- results_2024-01-28T23-00-45.925269.parquet
- split: latest
path:
- results_2024-01-28T23-00-45.925269.parquet
---
# Dataset Card for Evaluation run of SC99/Mistral-7B-privatemix-ia1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC99/Mistral-7B-privatemix-ia1](https://huggingface.co/SC99/Mistral-7B-privatemix-ia1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1",
"harness_winogrande_5",
split="train")
```
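As a small illustration of the split-naming convention described above (this helper is not part of the `datasets` library, just a sketch): a run timestamp such as `2024-01-28T23:00:45.925269` maps to the split name `2024_01_28T23_00_45.925269`, with dashes and colons replaced by underscores.

```python
# Sketch only (not a datasets API): derive a split name from a run
# timestamp by replacing "-" and ":" with "_".
def timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-28T23:00:45.925269"))
# 2024_01_28T23_00_45.925269
```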
## Latest results
These are the [latest results from run 2024-01-28T23:00:45.925269](https://huggingface.co/datasets/open-llm-leaderboard/details_SC99__Mistral-7B-privatemix-ia1/blob/main/results_2024-01-28T23-00-45.925269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6514069537640662,
"acc_stderr": 0.03224835259879914,
"acc_norm": 0.6505037607853619,
"acc_norm_stderr": 0.03293066455457689,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7178902486503331,
"mc2_stderr": 0.014856727473105872
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.01320319608853737,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7088229436367257,
"acc_stderr": 0.004533764686211992,
"acc_norm": 0.8858793069109739,
"acc_norm_stderr": 0.003173079807440182
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500097,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500097
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.01273239828619044,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.01273239828619044
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824873,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7178902486503331,
"mc2_stderr": 0.014856727473105872
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.0100125988056273
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.012670420440198681
}
}
```
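The per-task entries above can also be aggregated by hand. A minimal sketch (the dictionary below hard-codes a small excerpt of the results JSON shown above, purely for illustration) of averaging the `acc` metric across tasks:

```python
# Minimal sketch: average the "acc" metric over a few of the
# per-task entries from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.7141638225255973},
    "harness|hellaswag|10": {"acc": 0.7088229436367257},
    "harness|winogrande|5": {"acc": 0.850828729281768},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(f"mean acc over {len(results)} tasks: {mean_acc:.4f}")
```

The "all" block in the full JSON is computed in the same spirit, over every evaluated task rather than this three-task excerpt.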
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
D1st3f/Receipts | ---
license: openrail
task_categories:
- text2text-generation
language:
- it
- en
tags:
- finance
pretty_name: tiny_demo
size_categories:
- n<1K
--- |
felipesampaio/meumodelodevoz | ---
license: openrail
---
|
Janiele/vozfilmora | ---
license: openrail
---
|
VinayYadava/vin-orca-custom | ---
license: mit
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 45795
num_examples: 44
download_size: 30534
dataset_size: 45795
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iamnguyen/ds_by_sys_prompt_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 787059231.8499715
num_examples: 461461
download_size: 456893655
dataset_size: 787059231.8499715
---
# Dataset Card for "ds_by_sys_prompt_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/shirai_kuroko_toarumajutsunoindex | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shirai Kuroko
This is the dataset of Shirai Kuroko, containing 208 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 208 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 472 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 208 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 208 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 208 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 208 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 208 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 472 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 472 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 472 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Dahoas/gsm_socratic_conditional | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: response
dtype: string
- name: score_label
dtype: float64
splits:
- name: train
num_bytes: 71960142
num_examples: 50779
- name: val
num_bytes: 355612
num_examples: 256
- name: test
num_bytes: 1910650
num_examples: 1319
download_size: 35356297
dataset_size: 74226404
---
# Dataset Card for "gsm_socratic_conditional"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rshrott/properties6 | ---
dataset_info:
features:
- name: image
dtype: image
- name: labels
dtype:
class_label:
names:
'0': Poor
'1': Fair
'2': Good
'3': Great
'4': Excellent
'5': Not Applicable
splits:
- name: train
num_bytes: 17827628663.32
num_examples: 44368
- name: test
num_bytes: 954213459.76
num_examples: 2244
- name: validation
num_bytes: 1972754616.625
num_examples: 4475
download_size: 20875588311
dataset_size: 20754596739.704998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
hugfaceguy0001/LightNovels50kto100k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 112043756
num_examples: 493
download_size: 70367662
dataset_size: 112043756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
diluyedi/testset | ---
dataset_info:
features:
- name: id
dtype: string
- name: prompt
dtype: string
- name: basic_skills
dtype: string
- name: advanced_skills
dtype: string
- name: DALLE_3
dtype: image
- name: DALLE_3_Human
dtype: float
- name: DeepFloyd_I_XL_v1
dtype: image
- name: DeepFloyd_I_XL_v1_Human
dtype: float
- name: Midjourney_6
dtype: image
- name: Midjourney_6_Human
dtype: float
- name: SDXL_2_1
dtype: image
- name: SDXL_2_1_Human
dtype: float
- name: SDXL_Base
dtype: image
- name: SDXL_Base_Human
dtype: float
- name: SDXL_Turbo
dtype: image
- name: SDXL_Turbo_Human
dtype: float
splits:
- name: train
language:
- en
license: apache-2.0
size_categories:
- n<1K
--- |
KenDoStudio/Burnout3_DJStryker | ---
license: bigscience-openrail-m
---
|
open-llm-leaderboard/details_bhenrym14__airophin-v2-13b-PI-8k-fp16 | ---
pretty_name: Evaluation run of bhenrym14/airophin-v2-13b-PI-8k-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bhenrym14/airophin-v2-13b-PI-8k-fp16](https://huggingface.co/bhenrym14/airophin-v2-13b-PI-8k-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhenrym14__airophin-v2-13b-PI-8k-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T17:43:10.494860](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__airophin-v2-13b-PI-8k-fp16/blob/main/results_2023-09-22T17-43-10.494860.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0921770134228188,\n\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0921770134228188,\n\
\ \"em_stderr\": 0.00296245358879876,\n \"f1\": 0.2086210151006714,\n\
\ \"f1_stderr\": 0.0033790655527750446,\n \"acc\": 0.4199589150853921,\n\
\ \"acc_stderr\": 0.009541015115774397\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0921770134228188,\n \"em_stderr\": 0.00296245358879876,\n\
\ \"f1\": 0.2086210151006714,\n \"f1_stderr\": 0.0033790655527750446\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07354056103108415,\n \
\ \"acc_stderr\": 0.007189835754365268\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bhenrym14/airophin-v2-13b-PI-8k-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T17_43_10.494860
path:
- '**/details_harness|drop|3_2023-09-22T17-43-10.494860.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T17-43-10.494860.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T17_43_10.494860
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-43-10.494860.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T17-43-10.494860.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T17_43_10.494860
path:
- '**/details_harness|winogrande|5_2023-09-22T17-43-10.494860.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T17-43-10.494860.parquet'
- config_name: results
data_files:
- split: 2023_09_22T17_43_10.494860
path:
- results_2023-09-22T17-43-10.494860.parquet
- split: latest
path:
- results_2023-09-22T17-43-10.494860.parquet
---
# Dataset Card for Evaluation run of bhenrym14/airophin-v2-13b-PI-8k-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bhenrym14/airophin-v2-13b-PI-8k-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bhenrym14/airophin-v2-13b-PI-8k-fp16](https://huggingface.co/bhenrym14/airophin-v2-13b-PI-8k-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhenrym14__airophin-v2-13b-PI-8k-fp16",
"harness_winogrande_5",
split="train")
```
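Since run splits are named with the run timestamp (e.g. `2023_09_22T17_43_10.494860`), a small helper can recover a proper `datetime` from a split name. This is an illustrative sketch, not part of the official `datasets` tooling:

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    """Parse a run split name such as '2023_09_22T17_43_10.494860'
    back into a datetime (hypothetical helper, not official tooling)."""
    date_part, time_part = split_name.split("T")
    # Dates use '_' instead of '-', times use '_' instead of ':'
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

print(split_to_timestamp("2023_09_22T17_43_10.494860"))
# 2023-09-22 17:43:10.494860
```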
## Latest results
These are the [latest results from run 2023-09-22T17:43:10.494860](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__airophin-v2-13b-PI-8k-fp16/blob/main/results_2023-09-22T17-43-10.494860.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0921770134228188,
"em_stderr": 0.00296245358879876,
"f1": 0.2086210151006714,
"f1_stderr": 0.0033790655527750446,
"acc": 0.4199589150853921,
"acc_stderr": 0.009541015115774397
},
"harness|drop|3": {
"em": 0.0921770134228188,
"em_stderr": 0.00296245358879876,
"f1": 0.2086210151006714,
"f1_stderr": 0.0033790655527750446
},
"harness|gsm8k|5": {
"acc": 0.07354056103108415,
"acc_stderr": 0.007189835754365268
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
}
}
```
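The results block above is plain JSON, so it can be post-processed directly once loaded; as a sketch, here is one way to pull out just the per-task accuracies (the dict literal below simply mirrors the excerpt above):

```python
# Excerpt of the aggregate results shown above, as a plain dict.
latest = {
    "harness|drop|3": {"em": 0.0921770134228188, "f1": 0.2086210151006714},
    "harness|gsm8k|5": {"acc": 0.07354056103108415},
    "harness|winogrande|5": {"acc": 0.7663772691397001},
}

# Keep only the tasks that report an accuracy metric.
accs = {task: metrics["acc"] for task, metrics in latest.items() if "acc" in metrics}
print(accs)
# {'harness|gsm8k|5': 0.07354056103108415, 'harness|winogrande|5': 0.7663772691397001}
```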
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
quocanh34/data_for_synthesis_with_entities_align_v2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: intent
dtype: string
- name: sentence_annotation
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: file
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: origin_transcription
dtype: string
- name: sentence_norm
dtype: string
- name: w2v2_large_transcription
dtype: string
- name: wer
dtype: int64
- name: entities_norm
list:
- name: filler
dtype: string
- name: type
dtype: string
- name: entities_align
dtype: string
splits:
- name: train
num_bytes: 698110051.1801205
num_examples: 1413
download_size: 158745470
dataset_size: 698110051.1801205
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_for_synthesis_with_entities_align_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |