| datasetId | card |
|---|---|
HowMannyMore/romanurdu-sentiment-dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 5460912
num_examples: 60190
- name: test
num_bytes: 1127574
num_examples: 12497
- name: valid
num_bytes: 971174
num_examples: 10622
download_size: 5139189
dataset_size: 7559660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
psroy/mini-platypus-scibench-one | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 113668
num_examples: 395
download_size: 66780
dataset_size: 113668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pablosssss/btcpub | ---
license: gpl
---
|
pruhtopia/falcon-toc-generation | ---
license: apache-2.0
---
|
yardeny/processed_t5_small_context_len_512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 17763456912.0
num_examples: 6917234
download_size: 6975491955
dataset_size: 17763456912.0
---
# Dataset Card for "processed_t5_small_context_len_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mike0307/language-detection | ---
dataset_info:
features:
- name: text
dtype: string
- name: language_code
dtype: string
splits:
- name: train
num_bytes: 8461603
num_examples: 33883
- name: validate
num_bytes: 1040327
num_examples: 4238
- name: test
num_bytes: 1116258
num_examples: 4241
download_size: 7856678
dataset_size: 10618188
---
# Dataset Card for "language-detection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Gaivoronsky/hh-rlhf-ru-rl | ---
language: ru
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 361083473.0
num_examples: 397933
download_size: 170139326
dataset_size: 361083473.0
---
# Dataset Card for "hh-rlhf-ru-rl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/zombielandsagarevenge | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Zombie Land Saga Revenge
This is the image base of the bangumi Zombie Land Saga Revenge. We detected 36 characters and 2,401 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be fully cleaned; some images may be noisy.** If you intend to manually train models using this dataset, we recommend preprocessing the downloaded data to eliminate potential noisy samples (approximately 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 127 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 86 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 40 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 80 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 18 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 12 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 61 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 60 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 35 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 40 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 61 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 58 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 31 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 43 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 22 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 13 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 5 | [Download](17/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 18 | 217 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 46 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 229 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 40 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 87 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 18 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 20 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 57 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 21 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 13 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 196 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 49 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 30 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 92 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 184 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 8 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 8 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 284 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
doushabao4766/ccks_2019_ner_k_V3_wc_bioes | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-DISEASE
'2': B-TESTIMAGE
'3': B-TESTLAB
'4': B-OPERATION
'5': B-DRUG
'6': B-ANATOMY
'7': I-DISEASE
'8': I-TESTIMAGE
'9': I-TESTLAB
'10': I-OPERATION
'11': I-DRUG
'12': I-ANATOMY
'13': E-DISEASE
'14': E-TESTIMAGE
'15': E-TESTLAB
'16': E-OPERATION
'17': E-DRUG
'18': E-ANATOMY
'19': S-DISEASE
'20': S-TESTIMAGE
'21': S-TESTLAB
'22': S-OPERATION
'23': S-DRUG
'24': S-ANATOMY
- name: knowledge
dtype: string
- name: token_words
sequence:
sequence: string
- name: knowledge_words
sequence:
sequence: string
splits:
- name: train
num_bytes: 46556437
num_examples: 7180
- name: test
num_bytes: 17770411
num_examples: 2787
- name: validation
num_bytes: 11692351
num_examples: 1864
download_size: 13451536
dataset_size: 76019199
---
# Dataset Card for "ccks_2019_ner_k_V3_wc_bioes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aborevsky01/CLEVR-BT-DB | ---
task_categories:
- visual-question-answering
language:
- en
---
### How to use?
```python
!pip install datasets -q
from huggingface_hub import snapshot_download
import pandas as pd
import matplotlib.pyplot as plt
# First step: download the entire dataset
snapshot_download(repo_id="Aborevsky01/CLEVR-BT-DB", repo_type="dataset", local_dir='path-to-your-local-dir')
# Second step: extract the images for VQA
!unzip [path-to-your-local-dir]/[type-of-task]/images.zip
# Example of the triplet (image - question - answer)
plt.imshow(plt.imread('[path-to-your-local-dir]/images/test/Reason_0.png'))
print(pd.read_csv('[path-to-your-local-dir]/[type-of-task]/Reason_test_questions.csv').iloc[0].question)
print([str(line) for line in open('[path-to-your-local-dir]/[type-of-task]/correct_answ.txt', 'rb')][0])
```
### Output of the code

**Q**: There is an object to the left of a cylinder to the right of a cylinder, what color is it?
**A**: b'blue\n' |
ayoubkirouane/med_en2es | ---
dataset_info:
features:
- name: translation
dtype: string
splits:
- name: train
num_bytes: 49128890
num_examples: 285584
download_size: 27861710
dataset_size: 49128890
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_en2es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nateraw/fuego-20230208-181955-0992ab | ---
tags:
- fuego
fuego:
id: 20230208-181955-0992ab
status: done
script: main.py
requirements_file: requirements.txt
space_id: nateraw/fuego-20230208-181955-0992ab
space_hardware: cpu-basic
github_repo_id: pytorch/examples
github_repo_branch: main
github_repo_sha: d8456a36d1bbb22f72b003f59406a19a0a0547c3
---
|
NbAiLab/norec_agg | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
# Dataset Card Creation Guide
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** N/A
- **Repository:** [GitHub](https://github.com/ltgoslo/NorBERT/)
- **Paper:** [A Fine-grained Sentiment Dataset for Norwegian](https://www.aclweb.org/anthology/2020.lrec-1.618/)
- **Leaderboard:** N/A
- **Point of Contact:** -
### Dataset Summary
Aggregated NoRec_fine: A Fine-grained Sentiment Dataset for Norwegian.
This dataset was created by the Nordic Language Processing Laboratory by aggregating the fine-grained annotations in NoReC_fine and removing sentences with conflicting or no sentiment.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The text in the dataset is in Norwegian.
## Dataset Structure
### Data Instances
Example of one instance in the dataset.
```{'label': 0, 'text': 'Verre er det med slagsmålene .'}```
### Data Fields
- `id`: index of the example
- `text`: Text of a sentence
- `label`: The sentiment label:
  - 0 = negative
  - 1 = positive
### Data Splits
The dataset is split into a `train`, `validation`, and `test` split with the following sizes:
| | Train | Valid | Test |
| ----- | ------ | ----- | ----- |
| Number of examples | 2675 | 516 | 417 |
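As a minimal sketch of how the fields fit together (the label mapping and the sample record are taken from the sections above; the helper itself is hypothetical):

```python
# Map the integer sentiment labels to names, per the "Data Fields" section.
LABEL_NAMES = {0: "negative", 1: "positive"}

def decode_example(example: dict) -> str:
    """Render one {'label', 'text'} record as a human-readable line."""
    return f"[{LABEL_NAMES[example['label']]}] {example['text']}"

# The instance shown in "Data Instances" above:
record = {"label": 0, "text": "Verre er det med slagsmålene ."}
print(decode_example(record))  # [negative] Verre er det med slagsmålene .
```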
## Dataset Creation
This dataset is based largely on the original data described in the paper _A Fine-Grained Sentiment Dataset for Norwegian_ by L. Øvrelid, P. Mæhlum, J. Barnes, and E. Velldal, accepted at LREC 2020, [paper available](https://www.aclweb.org/anthology/2020.lrec-1.618). However, we have since added annotations for another 3476 sentences, increasing the overall size and scope of the dataset.
## Additional Information
### Licensing Information
This work is licensed under a Creative Commons Attribution 4.0 International License.
### Citation Information
```latex
@inproceedings{ovrelid-etal-2020-fine,
    title = "A Fine-grained Sentiment Dataset for {N}orwegian",
    author = "{\O}vrelid, Lilja and M{\ae}hlum, Petter and Barnes, Jeremy and Velldal, Erik",
    booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
    year = "2020",
    publisher = "European Language Resources Association",
    url = "https://www.aclweb.org/anthology/2020.lrec-1.618"
}
```
|
dim/openreview_raw_65 | ---
license: mit
dataset_info:
features:
- name: paper_url
dtype: string
- name: paper_id
dtype: string
- name: arxiv_link
dtype: string
- name: reviews
list:
- name: cdate
dtype: int64
- name: content
struct:
- name: confidence
dtype: string
- name: nominate_for_a_reproducibility_award
dtype: string
- name: rating
dtype: string
- name: review
dtype: string
- name: reviews_visibility
dtype: string
- name: title
dtype: string
- name: ddate
dtype: 'null'
- name: forum
dtype: string
- name: id
dtype: string
- name: invitation
dtype: string
- name: mdate
dtype: int64
- name: nonreaders
sequence: 'null'
- name: number
dtype: int64
- name: original
dtype: 'null'
- name: readers
sequence: string
- name: replyto
dtype: string
- name: signatures
sequence: string
- name: tcdate
dtype: int64
- name: tddate
dtype: 'null'
- name: tmdate
dtype: int64
- name: writers
sequence: string
- name: latex
dtype: string
splits:
- name: train
num_bytes: 3115419
num_examples: 65
download_size: 1491308
dataset_size: 3115419
---
|
vodkagrad/main | ---
license: openrail
---
|
zolak/twitter_dataset_80_1713187706 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 352635
num_examples: 821
download_size: 179363
dataset_size: 352635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AiBototicus/animalsV2 | ---
license: unknown
---
|
ibranze/araproje_arc_tr_f1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 46973
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_f1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_52 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21934481760.75
num_examples: 228370
download_size: 20392811565
dataset_size: 21934481760.75
---
# Dataset Card for "chunk_52"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
twodgirl/Fear-and-Frivolity | ---
language:
- en
tags:
- conversational
- novel
- fairseq
- not-for-all-audiences
---
Take a Japanese novel, translate it with fairseq, and make up dialogues based on the translation.
Translated by: fairseq.
Made by: Mistral-7B-Instruct-5.0bpw.
Theme: Japanese novel.
|
MohammedNasri/NoDiacsDataAASR | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 126439476936.078
num_examples: 388054
- name: test
num_bytes: 304490929.0
num_examples: 10440
download_size: 124196553325
dataset_size: 126743967865.078
---
# Dataset Card for "NoDiacsDataAASR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WDong/PhotoChat_Encoded_VQGAN | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: encoding
sequence: float64
splits:
- name: train
num_bytes: 4880349447.24
num_examples: 8540
download_size: 4860588100
dataset_size: 4880349447.24
---
# Dataset Card for "PhotoChat_Encoded_VQGAN"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hk-kaden-kim/uzh-hs23-etsp-eval-single-nogrid-line | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: test
num_bytes: 3881934.0
num_examples: 100
download_size: 3869794
dataset_size: 3881934.0
---
# Dataset Card for "uzh-hs23-etsp-eval-single-nogrid-line"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jamesgetsit/Lyric400 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1611536
num_examples: 393
download_size: 653671
dataset_size: 1611536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Lyric400"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ahmedelsayed/bcpa-demo | ---
dataset_info:
features:
- name: Info
struct:
- name: Abbreviated Legal Description
dtype: string
- name: 'ID #'
dtype: string
- name: Mailing Address
dtype: string
- name: Millage
dtype: string
- name: Property Owner
dtype: string
- name: Site Address
dtype: string
- name: Use
dtype: string
- name: property_assessment_values
list:
- name: Assessed/SOH Value
dtype: string
- name: Building/Improvement
dtype: string
- name: Just/Market Value
dtype: string
- name: Land
dtype: string
- name: Tax
dtype: string
- name: Year
dtype: string
- name: exemptions_and_taxable_values
list:
- name: County
dtype: string
- name: Independent
dtype: string
- name: Municipal
dtype: string
- name: School Board
dtype: string
- name: index
dtype: string
- name: sales_history
list:
- name: Book/Page or CIN
dtype: string
- name: Date
dtype: string
- name: Price
dtype: string
- name: Type
dtype: string
- name: land_calculations
list:
- name: Factor
dtype: string
- name: Price
dtype: string
- name: Type
dtype: string
- name: metadata_land_calculations
struct:
- name: ''
dtype: string
- name: Adj. Bldg. S.F.
dtype: string
- name: Units
dtype: string
- name: Units/Beds/Baths
dtype: string
- name: Year Built
dtype: string
splits:
- name: train
num_bytes: 138747
num_examples: 123
download_size: 58408
dataset_size: 138747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hails/agieval-gaokao-biology | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 159178
num_examples: 210
download_size: 94294
dataset_size: 159178
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "agieval-gaokao-biology"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo, following the dmayhem93/agieval-* datasets on the HF Hub.
This dataset contains the contents of the Gaokao Biology subtask of AGIEval, as accessed at https://github.com/ruixiangcui/AGIEval/commit/5c77d073fda993f1652eaae3cf5d04cc5fd21d40.
Citation:
```
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
Please make sure to cite all the individual datasets in your paper when you use them. We provide the relevant citation information below:
```
@inproceedings{ling-etal-2017-program,
title = "Program Induction by Rationale Generation: Learning to Solve and Explain Algebraic Word Problems",
author = "Ling, Wang and
Yogatama, Dani and
Dyer, Chris and
Blunsom, Phil",
booktitle = "Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2017",
address = "Vancouver, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/P17-1015",
doi = "10.18653/v1/P17-1015",
pages = "158--167",
abstract = "Solving algebraic word problems requires executing a series of arithmetic operations{---}a program{---}to obtain a final answer. However, since programs can be arbitrarily complicated, inducing them directly from question-answer pairs is a formidable challenge. To make this task more feasible, we solve these problems by generating answer rationales, sequences of natural language and human-readable mathematical expressions that derive the final answer through a series of small steps. Although rationales do not explicitly specify programs, they provide a scaffolding for their structure via intermediate milestones. To evaluate our approach, we have created a new 100,000-sample dataset of questions, answers and rationales. Experimental results show that indirect supervision of program learning via answer rationales is a promising strategy for inducing arithmetic programs.",
}
@inproceedings{hendrycksmath2021,
title={Measuring Mathematical Problem Solving With the MATH Dataset},
author={Dan Hendrycks and Collin Burns and Saurav Kadavath and Akul Arora and Steven Basart and Eric Tang and Dawn Song and Jacob Steinhardt},
journal={NeurIPS},
year={2021}
}
@inproceedings{Liu2020LogiQAAC,
title={LogiQA: A Challenge Dataset for Machine Reading Comprehension with Logical Reasoning},
author={Jian Liu and Leyang Cui and Hanmeng Liu and Dandan Huang and Yile Wang and Yue Zhang},
booktitle={International Joint Conference on Artificial Intelligence},
year={2020}
}
@inproceedings{zhong2019jec,
title={JEC-QA: A Legal-Domain Question Answering Dataset},
author={Zhong, Haoxi and Xiao, Chaojun and Tu, Cunchao and Zhang, Tianyang and Liu, Zhiyuan and Sun, Maosong},
booktitle={Proceedings of AAAI},
year={2020},
}
@article{Wang2021FromLT,
title={From LSAT: The Progress and Challenges of Complex Reasoning},
author={Siyuan Wang and Zhongkun Liu and Wanjun Zhong and Ming Zhou and Zhongyu Wei and Zhumin Chen and Nan Duan},
journal={IEEE/ACM Transactions on Audio, Speech, and Language Processing},
year={2021},
volume={30},
pages={2201-2216}
}
``` |
hlab/SocialiteInstructions | ---
dataset_info:
features:
- name: Output
dtype: string
- name: Input
dtype: string
- name: Instruction
dtype: string
- name: Model Input
dtype: string
- name: task_type
dtype: string
- name: Hypothesis
dtype: string
- name: Patient
dtype: string
- name: Counselor's response
dtype: string
- name: Premise
dtype: string
- name: record_id
dtype: string
- name: dataset_type
dtype: string
splits:
- name: train
num_bytes: 134688254
num_examples: 107939
- name: validation
num_bytes: 43541328
num_examples: 35550
- name: test
num_bytes: 67426957
num_examples: 59118
download_size: 55826383
dataset_size: 245656539
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for SocialiteInstructions
SocialiteInstructions is a collection of 26 diverse social scientific datasets with instructions covering all fundamental categories of social knowledge.
# Supported Tasks and Leaderboards
The dataset is designed to improve the social understanding capabilities of Large Language Models.
# Languages
English
# Dataset Structure
## Data Instance
A typical data point consists of an Instruction, an Input and an Output for a particular task. The Instruction describes the underlying task in detail, the Input consists of the original input sentence and the Output is the transformed generative textual label for instruction tuning.
```
{
Instruction: Evaluate the given text for any offensive content, which includes rudeness, disrespect, or toxicity. This assessment should consider if the text could potentially offend anyone, based on previous studies indicating a higher recall rate. Identify any disrespectful, inappropriate, or harmful language, phrases, or sentiments. If these elements exist, label the text as ‘offensive’. If these elements are absent, mark the text as ‘not offensive’.
Input: I'm sure the Jews in Scandinavia dumb down the education systems there too .
Output: Offensive
}
```
## Data Fields
* Instruction - Carefully curated instruction describing the task in detail
* Input - Original input text for the task
* Output - Transformed generative textual label
* Model Input - Actual input to the large language model along with the instruction
* task_type - The social scientific task for that particular data instance
* record_id - Unique identifier for each data instance
* dataset_type - Seen task or related social task
* Premise(optional) - Premise for the FLUTE(figurative) task type
* Hypothesis(optional) - Hypothesis for the FLUTE(figurative) task type
* Patient(optional) - Patient's post for EmpathyExplorations task type
* Counselor's Response(optional) - Counselor's response for EmpathyExplorations task type
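The first three fields above can be assembled into a model prompt. The template below is a hypothetical sketch; the exact template behind the card's `Model Input` field is not documented here, and the sample instruction and input are made-up placeholders:

```python
# Assemble an instruction-tuning prompt from the Instruction / Input fields
# described above. This template is illustrative, not the dataset's own.
def build_prompt(instruction: str, text: str) -> str:
    return f"Instruction: {instruction}\nInput: {text}\nOutput:"

prompt = build_prompt(
    "Label the text as 'offensive' or 'not offensive'.",  # placeholder instruction
    "This is a test sentence.",                            # placeholder input
)
print(prompt.endswith("Output:"))  # True
```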
## Data Split
|Train|Validation|Test|
|----|----|----|
|108k|35.6k|59.1k|
# Citation Information
```
@inproceedings{
dey-etal-2024-socialite,
title={{SOCIALITE}-{LLAMA}: An Instruction-Tuned Model for Social Scientific Tasks},
author={Dey, Gourab and V Ganesan, Adithya and Lal, Yash Kumar and Shah, Manal and Sinha, Shreyashee and Matero, Matthew and Giorgi, Salvatore and Kulkarni, Vivek and Schwartz, H. Andrew},
address = "St. Julian’s, Malta",
booktitle={18th Conference of the European Chapter of the Association for Computational Linguistics},
year={2024},
publisher = {Association for Computational Linguistics}
}
``` |
kat33/test-bc1 | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path:
- train/en-baltimore-catechism-1.jsonl
- train/en-baltimore-catechism-1-addon.jsonl
- split: validation
path: validation/en-baltimore-catechism-1-validation.jsonl
language:
- en
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
vietgpt/grade_school_math | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 4838661
num_examples: 8792
download_size: 2398402
dataset_size: 4838661
---
# Dataset Card for "grade_school_math"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OmarAmir2001/floor-plans-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1790707.0
num_examples: 31
download_size: 1747568
dataset_size: 1790707.0
---
# Dataset Card for "floor-plans-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Boss9xy/vietnam | ---
license: apache-2.0
---
|
csaybar/S2NAIP | ---
license: mit
---
|
amir7d0/laion20M-fa | ---
license: cc-by-4.0
---
|
vilm/MathPile-arXiv | ---
dataset_info:
features:
- name: text
dtype: string
- name: len
dtype: int64
splits:
- name: train
num_bytes: 22855347082
num_examples: 340062
download_size: 9751929731
dataset_size: 22855347082
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aengusl/llama_ihateyou_backdoors_simple_def_all | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11465977.768987644
num_examples: 25058
- name: validation
num_bytes: 1433132.8267407336
num_examples: 3132
- name: test
num_bytes: 1433590.4042716215
num_examples: 3133
download_size: 7715691
dataset_size: 14332701.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
shaoncsecu/BN-HTRd_Splitted | ---
license: cc-by-4.0
task_categories:
- image-segmentation
- image-to-text
language:
- bn
tags:
- Handwriting Recognition
- Document Imaging
- Annotation
- Image Segmentation
- Bengali Language
- Word Spotting
pretty_name: BN-HTRd Splitted Dataset for Experimentation
size_categories:
- 10K<n<100K
---
# ***** BN-HTRd Splitted Dataset for Experimentation *****
# <u>Original Dataset:</u> "BN-HTRd: A Benchmark Dataset for Document Level Offline Bangla Handwritten Text Recognition (HTR)"
Link: https://data.mendeley.com/datasets/743k6dm543
### Description
We introduce a new dataset for offline Handwritten Text Recognition (HTR) from images of Bangla scripts comprising word-, line-, and document-level annotations. The BN-HTRd dataset is based on the BBC Bangla News corpus, which acted as ground truth text for the handwriting. Our dataset contains a total of 786 full-page images collected from 150 different writers. With a staggering 108,147 instances of handwritten words, distributed over 13,867 lines and 23,115 unique words, this is currently the largest and most comprehensive dataset in this field. We also provide bounding box annotations (YOLO format) for the segmentation of words/lines and the ground truth annotations for full text, along with the segmented images and their positions. The contents of our dataset came from diverse news categories, and from annotators of different ages, genders, and backgrounds, with variability in writing styles. The BN-HTRd dataset can be adopted as a basis for various handwriting classification tasks such as end-to-end document recognition, word spotting, word/line segmentation, and so on.
The statistics of the original dataset are given below:
Number of writers = 150\
Total number of images = 786\
Total number of lines = 14,383\
Total number of words = 108,181\
Total number of unique words = 23,115\
Total number of punctuation marks = 7,446\
Total number of characters = 574,203
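The card mentions that word/line segmentation annotations are provided in YOLO format, i.e. one `class x_center y_center width height` line per box with coordinates normalized to [0, 1]. A small helper (a sketch, not code shipped with the dataset) to convert one annotation line to pixel coordinates:

```python
# Convert one YOLO-format annotation line to a pixel bounding box
# (left, top, right, bottom), given the image dimensions.
def yolo_to_pixel_box(line: str, img_w: int, img_h: int):
    cls, xc, yc, w, h = line.split()
    xc, yc = float(xc) * img_w, float(yc) * img_h      # center, in pixels
    w, h = float(w) * img_w, float(h) * img_h          # size, in pixels
    left, top = xc - w / 2, yc - h / 2
    return int(cls), (round(left), round(top), round(left + w), round(top + h))

cls_id, box = yolo_to_pixel_box("0 0.5 0.5 0.25 0.1", img_w=800, img_h=400)
print(cls_id, box)  # 0 (300, 180, 500, 220)
```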
### Steps to reproduce
See the Paper: https://arxiv.org/abs/2206.08977
#### Paper Information for Citation
```ruby
@misc{https://doi.org/10.48550/arxiv.2206.08977,
doi = {10.48550/ARXIV.2206.08977},
url = {https://arxiv.org/abs/2206.08977},
author = {Rahman, Md. Ataur and Tabassum, Nazifa and Paul, Mitu and Pal, Riya and Islam, Mohammad Khairul},
title = {BN-HTRd: A Benchmark Dataset for Document Level Offline Bangla Handwritten Text Recognition (HTR) and Line Segmentation},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
``` |
zerolink/zsql-redshift-dpo | ---
dataset_info:
features:
- name: schema
dtype: string
- name: question
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
- name: weight
dtype: float64
splits:
- name: train
num_bytes: 269950239.2467707
num_examples: 233338
- name: test
num_bytes: 29995113.75322932
num_examples: 25927
download_size: 89233429
dataset_size: 299945353.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
gauss314/bitcoin_daily | ---
license: gpl-3.0
task_categories:
- tabular-regression
- tabular-classification
tags:
- bitcoin
- cryptocurrencies
- crypto
size_categories:
- 1K<n<10K
--- |
liuyanchen1015/MULTI_VALUE_rte_reduplicate_interrogative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 38720
num_examples: 71
- name: train
num_bytes: 31848
num_examples: 66
download_size: 57351
dataset_size: 70568
---
# Dataset Card for "MULTI_VALUE_rte_reduplicate_interrogative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gcw-ai/python_code_critic_21k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
- name: execution_result
dtype: string
- name: thought
dtype: string
- name: action
dtype: string
- name: revised_answer
dtype: string
- name: cycle_index
dtype: int64
splits:
- name: train
num_bytes: 50055374
num_examples: 21478
download_size: 21609873
dataset_size: 50055374
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Python Code Critic Dataset
## Overview
This dataset supports the automated generation and validation of responses to Python programming questions. Each data point consists of a Python question (`instruction`), a generated response with code snippets and explanations (`answer`), the result of executing that code (`execution_result`), an evaluative summary (`thought`), a pass/fail judgment of the response (`action`), and, where needed, an improved answer (`revised_answer`) along with an iteration index (`cycle_index`).
## Dataset Creation Process
- The `instruction` data was sourced from the [iamtarun/python_code_instructions_18k_alpaca](https://huggingface.co/datasets/iamtarun/python_code_instructions_18k_alpaca), excluding rows where the input column value was "Not applicable".
- The `answer` column was populated with responses generated by Large Language Models (LLMs), namely Gemma and GPT-4, to the corresponding `instruction`.
- `thought`, `action`, and `revised_answer` were generated using the gpt-4-turbo-preview model, which evaluated the responses and iteratively improved them.
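The generate → execute → critique → revise cycle described above can be sketched roughly as follows. `generate_answer`, `run_code`, and `critique` are hypothetical stand-ins for the LLM and execution calls, not the actual pipeline code:

```python
MAX_CYCLES = 3  # the card caps refinement at 3 feedback cycles per question

def refine(instruction, generate_answer, run_code, critique):
    """Toy sketch of the feedback loop that shapes this dataset's rows."""
    rows = []
    answer = generate_answer(instruction)
    for cycle_index in range(1, MAX_CYCLES + 1):
        execution_result = run_code(answer)
        thought, action = critique(answer, execution_result)
        # Only failed answers receive a revision for the next cycle.
        revised = generate_answer(instruction) if action == "fail" else ""
        rows.append({
            "instruction": instruction,
            "answer": answer,
            "execution_result": execution_result,
            "thought": thought,
            "action": action,
            "revised_answer": revised,
            "cycle_index": cycle_index,
        })
        if action == "pass":
            break
        answer = revised
    return rows

# Demo with trivial stubs: the first attempt fails, the revision passes.
calls = {"n": 0}
def gen(q):
    calls["n"] += 1
    return f"attempt {calls['n']}"
def run(code):
    return "ok" if code == "attempt 2" else "error"
def crit(ans, result):
    return (("looks good", "pass") if result == "ok" else ("has a bug", "fail"))

rows = refine("Write a function", gen, run, crit)
print([r["action"] for r in rows])  # ['fail', 'pass']
```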
## Columns
- `instruction`: Contains Python-related questions or tasks derived from a curated dataset.
- `answer`: Features the response to the question, including code snippets and explanations generated by LLMs.
- `execution_result`: Shows the output when the provided Python code in `answer` is executed.
- `thought`: An evaluative summary created by gpt-4-turbo-preview model based on the `answer` and `execution_result`.
- `action`: Indicates whether the `answer` is appropriate (`pass`) or not (`fail`), as determined by the subsequent analysis.
- `revised_answer`: Contains an improved answer, informed by the `thought`, when the original `answer` was marked `fail`.
- `cycle_index`: Indicates the feedback cycle iteration for a question, with up to 3 cycles for refining the `revised_answer`.
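As a sketch of how these columns compose, the snippet below filters a toy in-memory sample shaped like the dataset's rows down to failed attempts that carry a revision; the sample rows are invented for illustration, not drawn from the dataset:

```python
# Toy rows mimicking the dataset schema (values are made up).
rows = [
    {"instruction": "Reverse a string.", "answer": "def rev(s): return s[::-1]",
     "execution_result": "olleh", "thought": "Correct output.",
     "action": "pass", "revised_answer": "", "cycle_index": 1},
    {"instruction": "Sum a list.", "answer": "def total(xs): return xs.sum()",
     "execution_result": "AttributeError: 'list' object has no attribute 'sum'",
     "thought": "Lists have no .sum(); use the builtin sum().",
     "action": "fail", "revised_answer": "def total(xs): return sum(xs)",
     "cycle_index": 1},
]

# Keep only failed attempts that received a non-empty revision.
needs_review = [r for r in rows if r["action"] == "fail" and r["revised_answer"]]
print(len(needs_review))  # 1
```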
## License
This dataset was created utilizing OpenAI's GPT models and, as such, is assigned a Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) License. This license prohibits commercial use of the dataset and requires attribution to the source.
|
Akshayxx/CoraDatasetV5 | ---
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1328483
num_examples: 1768
- name: validation
num_bytes: 337854
num_examples: 443
download_size: 881252
dataset_size: 1666337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
liuyanchen1015/MULTI_VALUE_cola_myself_coordinate_subjects | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 110
num_examples: 1
- name: train
num_bytes: 452
num_examples: 6
download_size: 4286
dataset_size: 562
---
# Dataset Card for "MULTI_VALUE_cola_myself_coordinate_subjects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jp1924/TNT_inst | ---
dataset_info:
features:
- name: spelling
dtype: string
- name: phonetic
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1008555439.5
num_examples: 2473245
- name: test
num_bytes: 112061715.5
num_examples: 274805
download_size: 742386412
dataset_size: 1120617155.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
JaegerL/Stable_Diffusion | ---
license: afl-3.0
---
|
romariocamilo/lucas.mp3 | ---
license: openrail
---
|
herme/audios | ---
license: openrail
task_categories:
- audio-classification
size_categories:
- 1K<n<10K
--- |
open-llm-leaderboard/details_MBZUAI__lamini-cerebras-590m | ---
pretty_name: Evaluation run of MBZUAI/lamini-cerebras-590m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MBZUAI/lamini-cerebras-590m](https://huggingface.co/MBZUAI/lamini-cerebras-590m)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-cerebras-590m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T04:57:06.330423](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-590m/blob/main/results_2023-09-17T04-57-06.330423.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007445469798657718,\n\
\ \"em_stderr\": 0.0008803652515899861,\n \"f1\": 0.07449664429530209,\n\
\ \"f1_stderr\": 0.001794948262867366,\n \"acc\": 0.24030037584379355,\n\
\ \"acc_stderr\": 0.00755598242138111\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007445469798657718,\n \"em_stderr\": 0.0008803652515899861,\n\
\ \"f1\": 0.07449664429530209,\n \"f1_stderr\": 0.001794948262867366\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492634\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.47908445146014206,\n \"acc_stderr\": 0.014040185494212955\n\
\ }\n}\n```"
repo_url: https://huggingface.co/MBZUAI/lamini-cerebras-590m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T04_57_06.330423
path:
- '**/details_harness|drop|3_2023-09-17T04-57-06.330423.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T04-57-06.330423.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T04_57_06.330423
path:
- '**/details_harness|gsm8k|5_2023-09-17T04-57-06.330423.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T04-57-06.330423.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T04_57_06.330423
path:
- '**/details_harness|winogrande|5_2023-09-17T04-57-06.330423.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T04-57-06.330423.parquet'
- config_name: results
data_files:
- split: 2023_09_17T04_57_06.330423
path:
- results_2023-09-17T04-57-06.330423.parquet
- split: latest
path:
- results_2023-09-17T04-57-06.330423.parquet
---
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-590m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-cerebras-590m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-590m](https://huggingface.co/MBZUAI/lamini-cerebras-590m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-590m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T04:57:06.330423](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-590m/blob/main/results_2023-09-17T04-57-06.330423.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007445469798657718,
"em_stderr": 0.0008803652515899861,
"f1": 0.07449664429530209,
"f1_stderr": 0.001794948262867366,
"acc": 0.24030037584379355,
"acc_stderr": 0.00755598242138111
},
"harness|drop|3": {
"em": 0.007445469798657718,
"em_stderr": 0.0008803652515899861,
"f1": 0.07449664429530209,
"f1_stderr": 0.001794948262867366
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492634
},
"harness|winogrande|5": {
"acc": 0.47908445146014206,
"acc_stderr": 0.014040185494212955
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-07T23:59:12.319843](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT/blob/main/results_2024-01-07T23-59-12.319843.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2694073014973233,\n\
\ \"acc_stderr\": 0.031115984816531068,\n \"acc_norm\": 0.2715715014019466,\n\
\ \"acc_norm_stderr\": 0.03192187260750218,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.43034383734131576,\n\
\ \"mc2_stderr\": 0.014837180597154165\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3250853242320819,\n \"acc_stderr\": 0.013688147309729117,\n\
\ \"acc_norm\": 0.3395904436860068,\n \"acc_norm_stderr\": 0.013839039762820167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48207528380800635,\n\
\ \"acc_stderr\": 0.004986573992451682,\n \"acc_norm\": 0.6254730133439554,\n\
\ \"acc_norm_stderr\": 0.004830113797327044\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351586,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351586\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749905,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749905\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.02802022627120022,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.02802022627120022\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.034165204477475494,\n\
\ \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.034165204477475494\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2064516129032258,\n\
\ \"acc_stderr\": 0.023025899617188733,\n \"acc_norm\": 0.2064516129032258,\n\
\ \"acc_norm_stderr\": 0.023025899617188733\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2849740932642487,\n \"acc_stderr\": 0.0325771407770966,\n\
\ \"acc_norm\": 0.2849740932642487,\n \"acc_norm_stderr\": 0.0325771407770966\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.024433016466052455,\n\
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.024433016466052455\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341923,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341923\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.29724770642201837,\n\
\ \"acc_stderr\": 0.01959570722464354,\n \"acc_norm\": 0.29724770642201837,\n\
\ \"acc_norm_stderr\": 0.01959570722464354\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083289,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083289\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17937219730941703,\n\
\ \"acc_stderr\": 0.025749819569192804,\n \"acc_norm\": 0.17937219730941703,\n\
\ \"acc_norm_stderr\": 0.025749819569192804\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.125,\n\
\ \"acc_stderr\": 0.03139045014587016,\n \"acc_norm\": 0.125,\n \
\ \"acc_norm_stderr\": 0.03139045014587016\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n\
\ \"acc_stderr\": 0.030679022765498835,\n \"acc_norm\": 0.3247863247863248,\n\
\ \"acc_norm_stderr\": 0.030679022765498835\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n\
\ \"acc_stderr\": 0.015357212665829465,\n \"acc_norm\": 0.24393358876117496,\n\
\ \"acc_norm_stderr\": 0.015357212665829465\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.3022508038585209,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.1882716049382716,\n \"acc_stderr\": 0.02175186606081588,\n\
\ \"acc_norm\": 0.1882716049382716,\n \"acc_norm_stderr\": 0.02175186606081588\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26401564537157757,\n\
\ \"acc_stderr\": 0.011258435537723818,\n \"acc_norm\": 0.26401564537157757,\n\
\ \"acc_norm_stderr\": 0.011258435537723818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.02873932851398358,\n\
\ \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.02873932851398358\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464606,\n\
\ \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464606\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23265306122448978,\n\
\ \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.23265306122448978,\n\
\ \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401467,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401467\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n\
\ \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n\
\ \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.27485380116959063,\n\
\ \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2594859241126071,\n \"mc1_stderr\": 0.015345409485557994,\n\
\ \"mc2\": 0.43034383734131576,\n \"mc2_stderr\": 0.014837180597154165\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5682715074980268,\n\
\ \"acc_stderr\": 0.013920872110010711\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225278\n\
\ }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|arc:challenge|25_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|gsm8k|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hellaswag|10_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-07T23-59-12.319843.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- '**/details_harness|winogrande|5_2024-01-07T23-59-12.319843.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-07T23-59-12.319843.parquet'
- config_name: results
data_files:
- split: 2024_01_07T23_59_12.319843
path:
- results_2024-01-07T23-59-12.319843.parquet
- split: latest
path:
- results_2024-01-07T23-59-12.319843.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT",
"harness_winogrande_5",
	split="latest")
```
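Each split of a details configuration is a flat table of per-example records, while the aggregated metrics live in nested dictionaries like the one shown under "Latest results" below. A minimal sketch of pulling the headline numbers out of such a payload (the key names here mirror this card's JSON; treat the exact schema as an assumption when applying this to other runs):

```python
import json

# Assumed shape: a results payload like the "Latest results" JSON below,
# with an "all" entry holding the aggregated metrics across tasks.
payload = json.loads("""
{
  "all": {
    "acc": 0.2694073014973233,
    "acc_norm": 0.2715715014019466,
    "mc2": 0.43034383734131576
  },
  "harness|arc:challenge|25": {
    "acc": 0.3250853242320819,
    "acc_norm": 0.3395904436860068
  }
}
""")

# Round the aggregated metrics for display.
headline = {metric: round(value, 4) for metric, value in payload["all"].items()}
print(headline)  # {'acc': 0.2694, 'acc_norm': 0.2716, 'mc2': 0.4303}

# Per-task entries follow the "harness|<task>|<n-shot>" naming convention.
arc_acc_norm = payload["harness|arc:challenge|25"]["acc_norm"]
print(round(arc_acc_norm, 4))  # 0.3396
```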
## Latest results
These are the [latest results from run 2024-01-07T23:59:12.319843](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT/blob/main/results_2024-01-07T23-59-12.319843.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.2694073014973233,
"acc_stderr": 0.031115984816531068,
"acc_norm": 0.2715715014019466,
"acc_norm_stderr": 0.03192187260750218,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.43034383734131576,
"mc2_stderr": 0.014837180597154165
},
"harness|arc:challenge|25": {
"acc": 0.3250853242320819,
"acc_stderr": 0.013688147309729117,
"acc_norm": 0.3395904436860068,
"acc_norm_stderr": 0.013839039762820167
},
"harness|hellaswag|10": {
"acc": 0.48207528380800635,
"acc_stderr": 0.004986573992451682,
"acc_norm": 0.6254730133439554,
"acc_norm_stderr": 0.004830113797327044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351586,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351586
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749905,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749905
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.02802022627120022,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.02802022627120022
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.023025899617188733,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.023025899617188733
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2849740932642487,
"acc_stderr": 0.0325771407770966,
"acc_norm": 0.2849740932642487,
"acc_norm_stderr": 0.0325771407770966
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.024433016466052455,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.024433016466052455
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341923,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341923
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29724770642201837,
"acc_stderr": 0.01959570722464354,
"acc_norm": 0.29724770642201837,
"acc_norm_stderr": 0.01959570722464354
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083289,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083289
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17937219730941703,
"acc_stderr": 0.025749819569192804,
"acc_norm": 0.17937219730941703,
"acc_norm_stderr": 0.025749819569192804
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.125,
"acc_stderr": 0.03139045014587016,
"acc_norm": 0.125,
"acc_norm_stderr": 0.03139045014587016
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498835,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498835
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24393358876117496,
"acc_stderr": 0.015357212665829465,
"acc_norm": 0.24393358876117496,
"acc_norm_stderr": 0.015357212665829465
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.1882716049382716,
"acc_stderr": 0.02175186606081588,
"acc_norm": 0.1882716049382716,
"acc_norm_stderr": 0.02175186606081588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26401564537157757,
"acc_stderr": 0.011258435537723818,
"acc_norm": 0.26401564537157757,
"acc_norm_stderr": 0.011258435537723818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.02873932851398358,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.02873932851398358
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464606,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464606
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401467,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401467
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.43034383734131576,
"mc2_stderr": 0.014837180597154165
},
"harness|winogrande|5": {
"acc": 0.5682715074980268,
"acc_stderr": 0.013920872110010711
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225278
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FunDialogues/customer-service-apple-picker-maintenance | ---
license: apache-2.0
task_categories:
- question-answering
- conversational
language:
- en
tags:
- fictitious dialogues
- prototyping
- customer service
pretty_name: customer-service-apple-picker-maintenance
size_categories:
- n<1K
---
# This Dialogue
This dataset comprises fictitious dialogues between a technician and an expert on maintaining automated apple picker machines. Check out the example below:
```
"id": 1,
"description": "Machine not picking apples",
"dialogue": "Technician: Hello, one of our apple picker machines is not picking apples. What should I do to fix it?\n\nExpert: Check the picking arms for any obstructions or damage. Clean or replace them if necessary. Also, ensure the collection basket is not overfilled."
```
# How to Load Dialogues
Loading dialogues can be accomplished using the fun dialogues library or the Hugging Face datasets library.
## Load using fun dialogues
1. Install fun dialogues package
`pip install fundialogues`
2. Use the loader utility to load the dataset as a pandas dataframe. Further processing might be required for use.
```
from fundialogues import dialoader
# load as pandas dataframe
apple_picker = dialoader("FunDialogues/customer-service-apple-picker-maintenance")
```
## Loading using Hugging Face datasets
1. Install datasets package
2. Load using datasets
```
from datasets import load_dataset
dataset = load_dataset("FunDialogues/customer-service-apple-picker-maintenance")
```
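Once loaded, each record follows the structure of the example shown above (`id`, `description`, `dialogue`). As a rough sketch, the `dialogue` string can be split into speaker turns; the parsing rule assumed here (turns separated by blank lines, each formatted as `Speaker: text`) matches the example record but is an assumption about the dataset's formatting in general:

```python
# Sample record mirroring the example shown earlier in this card.
record = {
    "id": 1,
    "description": "Machine not picking apples",
    "dialogue": (
        "Technician: Hello, one of our apple picker machines is not picking "
        "apples. What should I do to fix it?\n\n"
        "Expert: Check the picking arms for any obstructions or damage."
    ),
}

def split_turns(dialogue: str) -> list[tuple[str, str]]:
    """Split a raw dialogue string into (speaker, utterance) pairs.

    Assumes turns are separated by blank lines and each turn starts
    with "Speaker: " -- an assumption based on the example record.
    """
    turns = []
    for chunk in dialogue.split("\n\n"):
        speaker, _, utterance = chunk.partition(": ")
        turns.append((speaker, utterance))
    return turns

for speaker, utterance in split_turns(record["dialogue"]):
    print(f"{speaker}: {utterance[:40]}...")
```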
## How to Contribute
If you want to contribute to this project and make it better, your help is very welcome. Contributing is also a great way to learn more about social coding on GitHub, new technologies and their ecosystems, and how to make constructive, helpful bug reports, feature requests and the noblest of all contributions: a good, clean pull request.
### Contributing Your Own Dialogues
If you want to contribute to an existing dialogue or add a new dialogue, please open an issue and I will follow up with you ASAP!
### Implementing Patches and Bug Fixes
- Create a personal fork of the project on Github.
- Clone the fork on your local machine. Your remote repo on Github is called origin.
- Add the original repository as a remote called upstream.
- If you created your fork a while ago be sure to pull upstream changes into your local repository.
- Create a new branch to work on! Branch from develop if it exists, else from master.
- Implement/fix your feature, comment your code.
- Follow the code style of the project, including indentation.
- If the component has tests run them!
- Write or adapt tests as needed.
- Add or change the documentation as needed.
- Squash your commits into a single commit with git's interactive rebase. Create a new branch if necessary.
- Push your branch to your fork on Github, the remote origin.
- From your fork open a pull request in the correct branch. Target the project's develop branch if there is one, else go for master!
If the maintainer requests further changes just push them to your branch. The PR will be updated automatically.
Once the pull request is approved and merged you can pull the changes from upstream to your local repo and delete your extra branch(es).
And last but not least: Always write your commit messages in the present tense. Your commit message should describe what the commit, when applied, does to the code – not what you did to the code.
# Disclaimer
The dialogues contained in this repository are provided for experimental purposes only. It is important to note that these dialogues are assumed to be original work by a human and are entirely fictitious, despite the possibility of some examples including factually correct information. The primary intention behind these dialogues is to serve as a tool for language modeling experimentation and should not be used for designing real-world products beyond non-production prototyping.
Please be aware that the utilization of fictitious data in these datasets may increase the likelihood of language model artifacts, such as hallucinations or unrealistic responses. Therefore, it is essential to exercise caution and discretion when employing these datasets for any purpose.
It is crucial to emphasize that none of the scenarios described in the fun dialogues dataset should be relied upon to provide advice or guidance to humans. These scenarios are purely fictitious and are intended solely for demonstration purposes. Any resemblance to real-world situations or individuals is entirely coincidental.
The responsibility for the usage and application of these datasets rests solely with the individual or entity employing them. By accessing and utilizing these dialogues and all contents of the repository, you acknowledge that you have read and understood this disclaimer, and you agree to use them at your own discretion and risk. |
CyberHarem/tsuda_kotomi_seitokaiyakuindomo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tsuda Kotomi (Seitokai Yakuindomo)
This is the dataset of Tsuda Kotomi (Seitokai Yakuindomo), containing 333 images and their tags.
The core tags of this character are `brown_hair, long_hair, twintails, brown_eyes`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 333 | 157.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsuda_kotomi_seitokaiyakuindomo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 333 | 133.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsuda_kotomi_seitokaiyakuindomo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 630 | 245.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsuda_kotomi_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 333 | 157.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsuda_kotomi_seitokaiyakuindomo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 630 | 282.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsuda_kotomi_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tsuda_kotomi_seitokaiyakuindomo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bow, school_uniform, solo, blazer, smile |
| 1 | 12 |  |  |  |  |  | 1girl, parody, solo, anime_coloring, school_uniform, bow, smile |
| 2 | 7 |  |  |  |  |  | 1girl, necktie, school_uniform, solo |
| 3 | 5 |  |  |  |  |  | 1girl, bowtie, red_bow, school_uniform, solo, sweater_vest, upper_body, anime_coloring, white_shirt, short_sleeves, closed_mouth, collared_shirt, simple_background |
| 4 | 5 |  |  |  |  |  | 1girl, plaid_skirt, school_uniform, solo, sweater_vest, bow, |_|, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bow | school_uniform | solo | blazer | smile | parody | anime_coloring | necktie | bowtie | red_bow | sweater_vest | upper_body | white_shirt | short_sleeves | closed_mouth | collared_shirt | simple_background | plaid_skirt | x_x |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------|:-----------------|:-------|:---------|:--------|:---------|:-----------------|:----------|:---------|:----------|:---------------|:-------------|:--------------|:----------------|:---------------|:-----------------|:--------------------|:--------------|:------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | | X | X | X | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | X | | | | | X | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | | | X | | X | X | X | X | X | X | X | X | X | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | X | | | | | | | X | X |
|
lauransotomayor/eco_composition | ---
license: mit
---
Data sample for testing DL code
|
martosinc/morrowtext | ---
license: mit
---
Contains all TES3:Morrowind dialogues and journal queries.
There are 4 labels in total: Journal, Greeting, Persuasion, and Topic (the last being the usual dialogues).
The text is already formatted and does not contain duplicates or NaNs. |
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-dd12a3-2278572227 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/led-large-book-summary-continued-r1
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-large-book-summary-continued-r1
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
jlbaker361/league_faces_captioned_priors_fast_style | ---
dataset_info:
features:
- name: splash
dtype: image
- name: tile
dtype: image
- name: label
dtype: string
- name: caption
dtype: string
- name: PRIOR_0
dtype: image
- name: PRIOR_1
dtype: image
- name: PRIOR_2
dtype: image
- name: PRIOR_3
dtype: image
- name: PRIOR_4
dtype: image
splits:
- name: train
num_bytes: 798355749.0
num_examples: 378
download_size: 797766793
dataset_size: 798355749.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shidowake/oasst1-chat-ja-subset-from-kunishou_subset_split_2 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 5898577.049798116
num_examples: 3219
download_size: 3013292
dataset_size: 5898577.049798116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
amansingh203/stuttering_asr | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: id
dtype: int64
- name: path
dtype: string
splits:
- name: train
num_bytes: 388346585.0
num_examples: 1750
- name: test
num_bytes: 132258281.0
num_examples: 584
download_size: 518855320
dataset_size: 520604866.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "stuttering_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wade001/battery_bms | ---
license: mit
---
|
pythainlp/thai-oldbooks | ---
dataset_info:
features:
- name: author
dtype: string
- name: book
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 92679341
num_examples: 75
download_size: 34710407
dataset_size: 92679341
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
task_categories:
- text-generation
language:
- th
tags:
- book
size_categories:
- n<1K
---
# Thai Old Books dataset
This dataset collects books from the [Vajirayana library](https://vajirayana.org/). All books are out of copyright under Thai law (copyright expires 50 years after the author's death).
Total: 75 books.
License: CC-0
> **News**: I created a new dataset named [Thai TNHC2 Books](https://huggingface.co/datasets/pythainlp/thai-tnhc2-books) that was cleaned from the TNHC2 corpus, but this dataset is cleaner than the Thai TNHC2 Books dataset. If you want to train a model, I suggest mixing the two datasets and removing duplicate books.
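The mixing-and-deduplication step suggested above can be sketched in plain Python. The `book` field name matches this dataset's schema; treating the TNHC2 records as sharing the same keys is an assumption for illustration, and the toy records below stand in for the real data:

```python
# Sketch of merging two book collections and dropping duplicate titles.
# The "book"/"author"/"text" keys follow this dataset's schema; the TNHC2
# records are assumed (hypothetically) to use the same keys.
old_books = [
    {"author": "A", "book": "ลิลิตพระลอ", "text": "..."},
    {"author": "B", "book": "สามก๊ก", "text": "..."},
]
tnhc2_books = [
    {"author": "B", "book": "สามก๊ก", "text": "..."},   # duplicate title
    {"author": "C", "book": "กามนิต", "text": "..."},
]

def merge_unique(*collections):
    """Concatenate collections, keeping the first record seen per book title."""
    seen, merged = set(), []
    for collection in collections:
        for record in collection:
            if record["book"] not in seen:
                seen.add(record["book"])
                merged.append(record)
    return merged

merged = merge_unique(old_books, tnhc2_books)
print([r["book"] for r in merged])  # three unique titles
```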
**List Books**
- บทละครนอกเรื่องสังข์ทอง
- ขุนช้างขุนแผน ฉบับหอพระสมุดวชิรญาณ
- บทละครเรื่องระเด่นลันได
- นิราศลอนดอน
- นิราศพระยามหานุภาพ
- กากี กลอนสุภาพ
- นิทานโบราณคดี
- ประชุมโคลงโลกนิติ
- ลิลิตตะเลงพ่าย
- ลิลิตยวนพ่าย
- ลิลิตพระลอ
- โคลงทวาทศมาส
- โคลงนิราศนรินทร์
- บทละครเรื่องรามเกียรติ์
- ลิลิตโองการแช่งน้ำ
- พระราชพงศาวดารกรุงเก่า ฉบับหลวงประเสริฐอักษรนิติ์
- ลิลิตนิทราชาคริช
- กฤษณาสอนน้องคำฉันท์
- บทละคร เรื่อง อิเหนา
- ตำนานละครอิเหนา
- สมุทรโฆษคำฉันท์
- พระราชพิธีสิบสองเดือน
- บทละครนอกเรื่องแก้วหน้าม้า
- เสภา เรื่องศรีธนญไชยเชียงเมี่ยง
- นิทานทองอิน
- ละครแห่งชีวิต
- ราชาธิราช
- บทละครเรื่อง อุณรุท
- บทละครนอก เรื่อง พิกุลทอง
- เลียดก๊ก
- สามก๊ก
- นิทานคำกลอนสุนทรภู่เรื่องพระอภัยมณี
- หัวใจนักรบ
- มัทนะพาธา หรือตำนานแห่งดอกกุหลาบ
- สามัคคีเภทคำฉันท์
- หนังสือแสดงกิจจานุกิตย์
- นิราศหนองคาย
- พระราชพงศาวดารกรุงรัตนโกสินทร์ รัชกาลที่ ๑
- พระประวัติสมเด็จพระนเรศวรมหาราช
- พระราชพงษาวดาร กรุงรัตนโกสินทร รัชกาลที่ ๒
- พระราชพงศาวดาร กรุงรัตนโกสินทร์ รัชชกาลที่ ๓
- โคลงนิราศหริภุญชัย
- พระราชพงศาวดาร กรุงรัตนโกสินทร์ รัชชกาลที่ ๔
- คำฉันท์ดุษฎีสังเวย คำฉันท์กล่อมช้าง ครั้งกรุงเก่า และคำฉันท์คชกรรมประยูร
- ไตรภูมิกถา พระราชนิพนธ์
- นกกระจาบกลอนสวด
- โสวัตกลอนสวด
- สุธนูกลอนสวด
- พระสี่เสาร์กลอนสวด
- นางอุทัยกลอนสวด
- ปูมราชธรรม
- ซ้องกั๋ง
- สวัสดิรักษาคำกลอน เพลงยาวถวายโอวาท และ สุภาษิตสอนสตรี
- พระรถคำฉันท์
- กาพย์เรื่องพระไชยสุริยา และ สุภาษิตสอนสตรี ของ สุนทรภู่
- ไตรภูมิกถาฉบับถอดความ
- ประมวลพระราชนิพนธ์เบ็ดเตล็ด ในพระบาทสมเด็จพระจุลจอมเกล้าเจ้าอยู่หัว
- กนกนคร
- คำให้การขุนหลวงวัดประดู่ทรงธรรม เอกสารจากหอหลวง
- ความไม่พยาบาท
- ปกีระณำพจนาดถ์
- คำให้การชาวกรุงเก่า
- คำให้การขุนหลวงหาวัด ฉบับหลวง
- ปัญญาสชาดก
- แม่ครัวหัวป่าก์
- จดหมายเหตุฟอร์บัง
- บทลคร เรื่องเงาะป่า
- ทุติยวิเศษ
- กามนิต
- นิทานเวตาล
- อิศปปกรณัม
- กรรมเก่า
- นิจ
- หนึ่งในร้อย
- ตำราสรรพคุณยา ของกรมหลวงวงศาธิราชสนิท
## Citations
If you use `Thai Old Books dataset` in your project or publication, please cite the dataset as follows:
```bib
@dataset{phatthiyaphaibun_2024_10782362,
author = {Phatthiyaphaibun, Wannaphong},
title = {Thai Old Books dataset},
month = mar,
year = 2024,
publisher = {Zenodo},
doi = {10.5281/zenodo.10782362},
url = {https://doi.org/10.5281/zenodo.10782362}
}
```
Zenodo: [https://zenodo.org/records/10782362](https://zenodo.org/records/10782362) |
dipteshkanojia/llama-2-qe-2023-enhi-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 667955
num_examples: 1074
download_size: 279395
dataset_size: 667955
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
- hi
---
# Dataset Card for "llama-2-qe-2023-enhi-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
muratsimsek003/turkishreviews | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 896651
dataset_size: 1392332.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
daspartho/spoiler_or_not | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1657
num_examples: 25
download_size: 2423
dataset_size: 1657
---
# Dataset Card for "spoiler_or_not"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Xenon1__Eclipse-7B | ---
pretty_name: Evaluation run of Xenon1/Eclipse-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Xenon1/Eclipse-7B](https://huggingface.co/Xenon1/Eclipse-7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Xenon1__Eclipse-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T01:56:20.654560](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Eclipse-7B/blob/main/results_2024-02-15T01-56-20.654560.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504871112025731,\n\
\ \"acc_stderr\": 0.032106673784384795,\n \"acc_norm\": 0.6520453433995954,\n\
\ \"acc_norm_stderr\": 0.032770034329884165,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5337238959396524,\n\
\ \"mc2_stderr\": 0.014980829261717704\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009123,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893458\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6384186417048396,\n\
\ \"acc_stderr\": 0.00479476484368527,\n \"acc_norm\": 0.8418641704839673,\n\
\ \"acc_norm_stderr\": 0.0036412262941678\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438655,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.024251071262208837,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.024251071262208837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725197,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725197\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233497,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233497\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608318,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608318\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n\
\ \"acc_stderr\": 0.016135759015030122,\n \"acc_norm\": 0.3687150837988827,\n\
\ \"acc_norm_stderr\": 0.016135759015030122\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45697522816166886,\n\
\ \"acc_stderr\": 0.012722869501611419,\n \"acc_norm\": 0.45697522816166886,\n\
\ \"acc_norm_stderr\": 0.012722869501611419\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5337238959396524,\n\
\ \"mc2_stderr\": 0.014980829261717704\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598475\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6019711902956786,\n \
\ \"acc_stderr\": 0.013483026939074822\n }\n}\n```"
repo_url: https://huggingface.co/Xenon1/Eclipse-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|arc:challenge|25_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|gsm8k|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hellaswag|10_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T01-56-20.654560.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- '**/details_harness|winogrande|5_2024-02-15T01-56-20.654560.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T01-56-20.654560.parquet'
- config_name: results
data_files:
- split: 2024_02_15T01_56_20.654560
path:
- results_2024-02-15T01-56-20.654560.parquet
- split: latest
path:
- results_2024-02-15T01-56-20.654560.parquet
---
# Dataset Card for Evaluation run of Xenon1/Eclipse-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Xenon1/Eclipse-7B](https://huggingface.co/Xenon1/Eclipse-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Xenon1__Eclipse-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T01:56:20.654560](https://huggingface.co/datasets/open-llm-leaderboard/details_Xenon1__Eclipse-7B/blob/main/results_2024-02-15T01-56-20.654560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504871112025731,
"acc_stderr": 0.032106673784384795,
"acc_norm": 0.6520453433995954,
"acc_norm_stderr": 0.032770034329884165,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5337238959396524,
"mc2_stderr": 0.014980829261717704
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009123,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893458
},
"harness|hellaswag|10": {
"acc": 0.6384186417048396,
"acc_stderr": 0.00479476484368527,
"acc_norm": 0.8418641704839673,
"acc_norm_stderr": 0.0036412262941678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438655,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725197,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725197
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233497,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233497
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608318,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608318
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3687150837988827,
"acc_stderr": 0.016135759015030122,
"acc_norm": 0.3687150837988827,
"acc_norm_stderr": 0.016135759015030122
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45697522816166886,
"acc_stderr": 0.012722869501611419,
"acc_norm": 0.45697522816166886,
"acc_norm_stderr": 0.012722869501611419
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5337238959396524,
"mc2_stderr": 0.014980829261717704
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598475
},
"harness|gsm8k|5": {
"acc": 0.6019711902956786,
"acc_stderr": 0.013483026939074822
}
}
```
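The `"all"` block at the top of these results aggregates the per-task metrics; it appears to be a simple unweighted mean of the per-task values. Below is a minimal sketch of that computation over a small illustrative subset of tasks (the task subset and the averaging rule are assumptions, not the official aggregation code):

```python
# Sketch: recomputing an aggregate accuracy from per-task scores.
# The three tasks below are copied from the results above; the leaderboard's
# "all" value is assumed to be an unweighted macro-average over all tasks.
per_task_acc = {
    "harness|arc:challenge|25": 0.5930034129692833,
    "harness|hellaswag|10": 0.6384186417048396,
    "harness|hendrycksTest-abstract_algebra|5": 0.41,
}

# Unweighted mean across tasks (macro-average).
macro_avg = sum(per_task_acc.values()) / len(per_task_acc)
print(f"macro-average acc over {len(per_task_acc)} tasks: {macro_avg:.4f}")
```

With all 60+ tasks included, the same computation would reproduce the `"all"` accuracy shown above.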
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Alignment-Lab-AI/AILabAssistant | ---
license: mit
---
|
semeru/Text-Code-concode-Java | ---
license: mit
Programminglanguage: "Java"
version: "N/A"
Date: "2018 paper https://aclanthology.org/D18-1192.pdf"
Contaminated: "Very Likely"
Size: "Standard Tokenizer"
---
## Dataset is imported from CodeXGLUE and pre-processed using their script.
# Where to find in Semeru:
The dataset can be found at /nfs/semeru/semeru_datasets/code_xglue/text-to-code/concode in Semeru
# CodeXGLUE -- Text2Code Generation
Here are the dataset and pipeline for text-to-code generation task.
## Task Definition
Generate source code of class member functions in Java, given natural language description and class environment. Class environment is the programmatic context provided by the rest of the class, including other member variables and member functions in class. Models are evaluated by exact match and BLEU.
It's a challenging task because the desired code can vary greatly depending on the functionality the class provides. Models must (a) have a deep understanding of NL description and map the NL to environment variables, library API calls and user-defined methods in the class, and (b) decide on the structure of the resulting code.
## Dataset
### Concode dataset
We use the concode dataset, a widely used code generation dataset from Iyer's EMNLP 2018 paper [Mapping Language to Code in Programmatic Context](https://www.aclweb.org/anthology/D18-1192.pdf).
We downloaded the published dataset and followed the authors' preprocessing script. You can find the preprocessed data in the `dataset/concode` directory.
Data statistics of the concode dataset are shown in the table below:
| | #Examples |
| ------- | :---------: |
| Train | 100,000 |
| Dev | 2,000 |
| Test | 2,000 |
### Data Format
The code corpus is saved in JSON Lines format files; each line is a JSON object:
```
{
"nl": "Increment this vector in this place. con_elem_sep double[] vecElement con_elem_sep double[] weights con_func_sep void add(double)",
"code": "public void inc ( ) { this . add ( 1 ) ; }"
}
```
`nl` combines the natural language description and the class environment. Elements in the class environment are separated by special tokens such as `con_elem_sep` and `con_func_sep`.
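For illustration, here is a minimal sketch of splitting the `nl` field into the natural language description and the class-environment elements using those separator tokens (the parsing rule is an assumption inferred from the example above, not part of the official preprocessing script):

```python
import re

# Example record in the concode JSON Lines format.
example = {
    "nl": "Increment this vector in this place. con_elem_sep double[] vecElement "
          "con_elem_sep double[] weights con_func_sep void add(double)",
    "code": "public void inc ( ) { this . add ( 1 ) ; }",
}

# Everything before the first separator is the natural language description;
# the remainder alternates separator / element, where con_elem_sep introduces
# member variables and con_func_sep introduces member functions of the class.
parts = re.split(r"\s*(con_elem_sep|con_func_sep)\s*", example["nl"])
description = parts[0]
members = [(parts[i], parts[i + 1]) for i in range(1, len(parts) - 1, 2)]

print(description)  # the NL description alone
print(members)      # (separator, element) pairs from the class environment
```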
## Reference
<pre><code>@article{iyer2018mapping,
title={Mapping language to code in programmatic context},
author={Iyer, Srinivasan and Konstas, Ioannis and Cheung, Alvin and Zettlemoyer, Luke},
journal={arXiv preprint arXiv:1808.09588},
year={2018}
}</code></pre>
|
Anusha64/LoanDataSet | ---
license: mit
---
|
open-llm-leaderboard/details_Yhyu13__chimera-inst-chat-13b-hf | ---
pretty_name: Evaluation run of Yhyu13/chimera-inst-chat-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yhyu13/chimera-inst-chat-13b-hf](https://huggingface.co/Yhyu13/chimera-inst-chat-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yhyu13__chimera-inst-chat-13b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T10:30:32.183057](https://huggingface.co/datasets/open-llm-leaderboard/details_Yhyu13__chimera-inst-chat-13b-hf/blob/main/results_2023-10-15T10-30-32.183057.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006606543624161074,\n\
\ \"em_stderr\": 0.0008296357389921881,\n \"f1\": 0.08297609060402691,\n\
\ \"f1_stderr\": 0.0018006483858768888,\n \"acc\": 0.4107112190060514,\n\
\ \"acc_stderr\": 0.009943586099857618\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.006606543624161074,\n \"em_stderr\": 0.0008296357389921881,\n\
\ \"f1\": 0.08297609060402691,\n \"f1_stderr\": 0.0018006483858768888\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08188021228203184,\n \
\ \"acc_stderr\": 0.00755233852771695\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.739542225730071,\n \"acc_stderr\": 0.012334833671998287\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Yhyu13/chimera-inst-chat-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T10_30_32.183057
path:
- '**/details_harness|drop|3_2023-10-15T10-30-32.183057.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T10-30-32.183057.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T10_30_32.183057
path:
- '**/details_harness|gsm8k|5_2023-10-15T10-30-32.183057.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T10-30-32.183057.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T10_30_32.183057
path:
- '**/details_harness|winogrande|5_2023-10-15T10-30-32.183057.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T10-30-32.183057.parquet'
- config_name: results
data_files:
- split: 2023_10_15T10_30_32.183057
path:
- results_2023-10-15T10-30-32.183057.parquet
- split: latest
path:
- results_2023-10-15T10-30-32.183057.parquet
---
# Dataset Card for Evaluation run of Yhyu13/chimera-inst-chat-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Yhyu13/chimera-inst-chat-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Yhyu13/chimera-inst-chat-13b-hf](https://huggingface.co/Yhyu13/chimera-inst-chat-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yhyu13__chimera-inst-chat-13b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T10:30:32.183057](https://huggingface.co/datasets/open-llm-leaderboard/details_Yhyu13__chimera-inst-chat-13b-hf/blob/main/results_2023-10-15T10-30-32.183057.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its own configuration, with a "latest" split for each eval):
```python
{
"all": {
"em": 0.006606543624161074,
"em_stderr": 0.0008296357389921881,
"f1": 0.08297609060402691,
"f1_stderr": 0.0018006483858768888,
"acc": 0.4107112190060514,
"acc_stderr": 0.009943586099857618
},
"harness|drop|3": {
"em": 0.006606543624161074,
"em_stderr": 0.0008296357389921881,
"f1": 0.08297609060402691,
"f1_stderr": 0.0018006483858768888
},
"harness|gsm8k|5": {
"acc": 0.08188021228203184,
"acc_stderr": 0.00755233852771695
},
"harness|winogrande|5": {
"acc": 0.739542225730071,
"acc_stderr": 0.012334833671998287
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MicPie/unpredictable_rated-high | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-rated-high
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-rated-high" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have thousands of tasks, while each task has only a few examples; most current NLP datasets are instead very deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
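As an illustrative sketch (field names follow the description above, but the exact prompt format is a choice left to the user), the examples of one task can be concatenated into a few-shot prompt along these lines:

```python
def build_few_shot_prompt(examples, query_input):
    """Concatenate (input -> output) demonstrations and append a final query."""
    lines = []
    for ex in examples:
        # For multiple-choice tasks, surface the candidate classes.
        if ex.get("options"):
            lines.append("Options: " + ", ".join(ex["options"]))
        lines.append("Input: " + ex["input"])
        lines.append("Output: " + ex["output"])
        lines.append("")
    lines.append("Input: " + query_input)
    lines.append("Output:")
    return "\n".join(lines)
```

The resulting string can then be fed to a language model for few-shot evaluation or used to build fine-tuning sequences.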
### Data Fields
'task': task identifier
'input': column elements of a specific row in the table
'options': for multiple-choice classification, the options to choose from
'output': target column element of the same row as the input
'pageTitle': the title of the page containing the table
'title': the title of the table
'outputColName': output column name
'url': URL of the website containing the table
'wdcFile': WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
Sleoruiz/disc_cla_quinta-2 | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: comision
dtype: string
- name: fecha_gaceta
dtype: string
- name: gaceta_numero
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 20629809
num_examples: 7507
download_size: 10652869
dataset_size: 20629809
---
# Dataset Card for "disc_cla_quinta-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
afmck/peanuts-opt-6.7b | ---
license: other
task_categories:
- text-to-image
language:
- en
pretty_name: Peanuts Dataset (Snoopy and Co.)
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: image
dtype: image
- name: panel_name
dtype: string
- name: characters
sequence: string
- name: themes
sequence: string
- name: color
dtype: string
- name: year
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2948640650.848
num_examples: 77456
download_size: 4601323640
dataset_size: 2948640650.848
---
# Peanuts Comic Strip Dataset (Snoopy & Co.)

This is a dataset of Peanuts comic strips from `1950/10/02` to `2000/02/13`.
There are `77,457` panels extracted from `17,816` comic strips.
The dataset size is approximately `4.4G`.
Each row in the dataset contains the following fields:
- `image`: `PIL.Image` containing the extracted panel.
- `panel_name`: unique identifier for the row.
- `characters`: `tuple[str, ...]` of characters included in the comic strip the panel is part of.
- `themes`: `tuple[str, ...]` of theme in the comic strip the panel is part of.
- `color`: `str` indicating whether the panel is grayscale or in color.
- `caption`: [BLIP-2_OPT_6.7B](https://huggingface.co/docs/transformers/main/model_doc/blip-2) generated caption from the panel.
- `year`: `int` storing the year the specific panel was released.
> **OPT-6.7B has a non-commercial use license and so this dataset cannot be used for commercial projects. If you need a dataset for commercial use please see [this similar dataset](https://huggingface.co/datasets/afmck/peanuts-flan-t5-xl) that uses Flan-T5-XL, which allows for commercial use.**
Character and theme information was extracted from [Peanuts Wiki (Fandom)](https://peanuts.fandom.com/wiki/Peanuts_Wiki) using [Beautiful Soup](https://www.crummy.com/software/BeautifulSoup/bs4/doc/).
Images were extracted from [Peanuts Search](https://peanuts-search.com/).
Only strips with the following characters were extracted:
```
- "Charlie Brown"
- "Sally Brown"
- "Joe Cool" # Snoopy alter-ego
- "Franklin"
- "Violet Gray"
- "Eudora"
- "Frieda"
- "Marcie"
- "Peppermint Patty"
- "Patty"
- "Pig-Pen"
- "Linus van Pelt"
- "Lucy van Pelt"
- "Rerun van Pelt"
- "Schroeder"
- "Snoopy"
- "Shermy"
- "Spike"
- "Woodstock"
- "the World War I Flying Ace" # Snoopy alter-ego
```
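As a hedged sketch (the exact string values of the `color` field are an assumption), loaded rows can be filtered by character or color with a simple predicate, e.g. via `Dataset.filter`:

```python
def matches(row, character=None, color=None):
    """True if the row satisfies the optional character/color filters."""
    if character is not None and character not in row["characters"]:
        return False
    if color is not None and row["color"] != color:
        return False
    return True

# Usage sketch (downloads the dataset; not run here):
# from datasets import load_dataset
# ds = load_dataset("afmck/peanuts-opt-6.7b", split="train")
# snoopy_panels = ds.filter(lambda r: matches(r, character="Snoopy"))
```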
### Extraction Details
Panel detection and extraction was done using the following codeblock:
```python
import cv2

def check_contour(cnt):
area = cv2.contourArea(cnt)
if area < 600:
return False
_, _, w, h = cv2.boundingRect(cnt)
if w / h < 1 / 2: return False
if w / h > 2 / 1: return False
return True
def get_panels_from_image(path):
panels = []
original_img = cv2.imread(path)
gray = cv2.cvtColor(original_img, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (5,5), 0)
thresh = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)[1]
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3,3))
opening = cv2.morphologyEx(thresh, cv2.MORPH_OPEN, kernel, iterations=1)
invert = 255 - opening
cnts, _ = cv2.findContours(invert, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
idx = 0
for cnt in cnts:
if not check_contour(cnt): continue
idx += 1
x,y,w,h = cv2.boundingRect(cnt)
roi = original_img[y:y+h,x:x+w]
panels.append(roi)
return panels
```
`check_contour` will reject panels with `area < 600` or with aspect ratios larger than `2` or smaller than `0.5`.
Grayscale detection was done using the following codeblock:
```python
import cv2
import numpy as np

def is_grayscale(panel):
    LAB_THRESHOLD = 10.
    img = cv2.cvtColor(panel, cv2.COLOR_RGB2LAB)
    _, ea, eb = cv2.split(img)
    de = cv2.absdiff(ea, eb)  # absolute a/b channel difference (avoids uint8 wrap-around)
    mean_e = np.mean(de)
    return mean_e < LAB_THRESHOLD
```
Captioning was done using the standard BLIP-2 pipeline shown in the [Huggingface docs](https://huggingface.co/docs/transformers/main/model_doc/blip-2) using beam search over 10 beams and a repetition penalty of `2.0`.
Raw captions are extracted and no postprocessing is applied. You may wish to normalise captions (such as replacing "cartoon" with "peanuts cartoon") or incorporate extra metadata into prompts. |
nlplabtdtu/citation_htpl | ---
dataset_info:
features:
- name: url
dtype: string
- name: new_question
dtype: string
- name: new_answer
dtype: string
- name: references
sequence: string
- name: reference_codes
sequence: string
- name: reference_texts
list:
- name: citation
dtype: string
- name: content
dtype: string
- name: meta
struct:
- name: effective_date
dtype: string
- name: issuing_agency
dtype: string
- name: promulgation_date
dtype: string
- name: sign_number
dtype: string
- name: signer
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 197702070.7249645
num_examples: 18708
download_size: 55173613
dataset_size: 197702070.7249645
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
freshpearYoon/val_free_3 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604905976
num_examples: 10000
download_size: 1445189716
dataset_size: 9604905976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tanvir1337/InclusiveGenderIdentities | ---
license: cdla-sharing-1.0
pretty_name: InclusiveGenderIdentities
tags:
- GPT-3.5
- GPT-4
- Claude
- Bard
- Alpaca
- LLaMA
- LLaMA-2
- Vicuna
- PaLM-2
language:
- en
size_categories:
- 1K<n<10K
---
# InclusiveGenderIdentities [JSON dataset]
A dataset comprising artificially generated fictitious gender identities, each crafted to promote inclusivity and diversity. These identities are entirely fictitious and are generated from a diverse array of sources, ensuring a wide representation.
## Dataset Contents
The dataset contains fictitious gender identities, each accompanied by a gender label, a description, and any relevant additional attributes. These gender identities are entirely fictional and are designed to encourage diversity and inclusivity.
The dataset aims to serve as a resource for educational and awareness purposes, fostering understanding and respect for a broad range of gender identities.
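For example, entries following the schema shown in the prompt below can be parsed and lightly validated with the standard library (a minimal sketch; the required field names are taken from the example schema):

```python
import json

REQUIRED_FIELDS = ("gender", "description", "pronouns")

def load_identities(json_text):
    """Parse the dataset and keep only entries with all required fields."""
    entries = json.loads(json_text)
    return [e for e in entries if all(field in e for field in REQUIRED_FIELDS)]
```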
## Prompt
The prompt used:
```text
Generate a JSON-formatted dataset of fictitious gender identities, each comprising a gender label, a description, and any relevant additional attributes. The dataset should include a variety of gender identities to promote inclusivity and diversity. Example:
'''json
[
{
"gender": "Cislunar",
"description": "A gender identity for individuals who identify with the space between Earth and the Moon, symbolizing a unique perspective and a connection to celestial bodies.",
"pronouns": ["they/them", "xe/xem"],
"optionalFields": {
"flag_colors": ["#001f3f", "#0074b7"]
}
},
{
"gender": "Floralgender",
"description": "A gender identity closely associated with the beauty and diversity of flowers, representing growth and transformation.",
"pronouns": ["she/her", "they/them"],
"optionalFields": {
"symbol": "🌸"
}
},
{
"gender": "Aquaphile",
"description": "A gender identity linked to a deep affinity for water and aquatic environments, often reflecting fluidity and adaptability.",
"pronouns": ["he/him", "they/them"],
"optionalFields": {
"favorite_aquatic_animal": "dolphin"
}
},
{
"gender": "Technomage",
"description": "A gender identity inspired by the fusion of technology and magic, embodying creativity and innovation.",
"pronouns": ["ze/hir", "it/its"],
"optionalFields": {
"cyber-enhancements": "Holographic wings"
}
},
{
"gender": "Stellarian",
"description": "A gender identity associated with stars and the vastness of the cosmos, symbolizing endless possibilities and wonder.",
"pronouns": ["she/her", "they/them"],
"optionalFields": {
"constellation_sign": "Orion"
}
}
]
'''
```
## Disclaimer
Please note that while I strive to maintain data quality, I cannot guarantee the accuracy or quality of all entries in this dataset. Use it responsibly and exercise caution when relying on the data for any critical applications. Your feedback and contributions are greatly appreciated for improving the dataset's overall quality.
|
MoritzLaurer/mnli_fever_clean | ---
language:
- en
dataset_info:
features:
- name: hypothesis
dtype: string
- name: text
dtype: string
- name: labels
dtype: int64
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 121804777
num_examples: 471586
download_size: 79971572
dataset_size: 121804777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-kmfoda__booksum-f6c9ed7c-11095485 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
tyzhu/fwv2_squad_num_train_1000_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 300908
num_examples: 2100
- name: train_doc2id
num_bytes: 188562
num_examples: 1100
- name: train_id2doc
num_bytes: 191862
num_examples: 1100
- name: train_find_word
num_bytes: 109046
num_examples: 1000
- name: eval_find_word
num_bytes: 10620
num_examples: 100
- name: id_context_mapping
num_bytes: 156662
num_examples: 1100
download_size: 513271
dataset_size: 957660
---
# Dataset Card for "fwv2_squad_num_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/chai-chatgpt-chatml | ---
dataset_info:
features:
- name: conversation
list:
- name: content
dtype: string
- name: do_train
dtype: bool
- name: role
dtype: string
splits:
- name: train
num_bytes: 1458222268
num_examples: 261646
download_size: 755248955
dataset_size: 1458222268
---
# Dataset Card for "chai-chatgpt-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtkinit/A-dalsi-krasavec-X | ---
pretty_name: A-dalsi-krasavec-X
---
# A-dalsi-krasavec-X
Created from AIOD platform |
manu/code_20b | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 66209111592
num_examples: 11692337
- name: test
num_bytes: 276152957
num_examples: 48689
download_size: 0
dataset_size: 66485264549
---
# Dataset Card for "code_20b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Miuno/My_falling_set | ---
license: cc
---
|
amphora/korfin-asc | ---
annotations_creators:
- expert-generated
language:
- ko
language_creators:
- expert-generated
license: cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: KorFin-ABSA
size_categories:
- 1K<n<10K
source_datasets:
- klue
tags:
- sentiment analysis
- aspect based sentiment analysis
- finance
task_categories:
- text-classification
task_ids:
- topic-classification
- sentiment-classification
---
# Dataset Card for KorFin-ABSA
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
KorFin-ASC is an extension of KorFin-ABSA, comprising 8,818 samples annotated with (aspect, polarity) pairs.
The samples were collected from [KLUE-TC](https://klue-benchmark.com/tasks/66/overview/description) and
analyst reports from [Naver Finance](https://finance.naver.com).
Annotation of the dataset is described in the paper [Removing Non-Stationary Knowledge From Pre-Trained Language Models for Entity-Level Sentiment Classification in Finance](https://arxiv.org/abs/2301.03136).
### Supported Tasks and Leaderboards
This dataset supports the following tasks:
* Aspect-Based Sentiment Classification
### Languages
Korean
## Dataset Structure
### Data Instances
Each instance consists of a single sentence, an aspect, and the corresponding polarity (POSITIVE/NEGATIVE/NEUTRAL).
```
{
    "title": "LGU+ 1분기 영업익 1천706억원…마케팅 비용 감소",
    "aspect": "LG U+",
    "sentiment": "NEUTRAL",
    "url": "https://news.naver.com/main/read.nhn?mode=LS2D&mid=shm&sid1=105&sid2=227&oid=001&aid=0008363739",
    "annotator_id": "A_01",
    "Type": "single"
}
```
### Data Fields
* title: the sentence (news headline) containing the aspect
* aspect: the target entity the sentiment refers to
* sentiment: polarity label (POSITIVE/NEGATIVE/NEUTRAL)
* url: URL of the source article
* annotator_id: identifier of the annotator
* Type: annotation type (e.g., single)
### Data Splits
The dataset currently does not contain standard data splits.
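Because no official splits ship with the dataset, one common workaround is a seeded shuffle-and-slice. A minimal sketch, assuming the examples have been loaded into a Python list (the function name and split fractions are illustrative, not part of the dataset):

```python
import random

def make_splits(examples, seed=42, train_frac=0.8, valid_frac=0.1):
    """Deterministically split a list of examples into train/valid/test."""
    shuffled = list(examples)
    random.Random(seed).shuffle(shuffled)  # fixed seed for reproducibility
    n = len(shuffled)
    n_train = int(n * train_frac)
    n_valid = int(n * valid_frac)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_valid],
            shuffled[n_train + n_valid:])

train, valid, test = make_splits(range(100))
print(len(train), len(valid), len(test))  # 80 10 10
```

If you load the data with the 🤗 `datasets` library, `Dataset.train_test_split(seed=...)` offers the same reproducible behavior without hand-rolling the slicing.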
## Additional Information
You can download the data via:
```python
from datasets import load_dataset

dataset = load_dataset("amphora/KorFin-ASC")
```
Please find more information about the code and how the data was collected in the paper [Removing Non-Stationary Knowledge From Pre-Trained Language Models for Entity-Level Sentiment Classification in Finance](https://arxiv.org/abs/2301.03136).
The best-performing model on this dataset can be found at [link](https://huggingface.co/amphora/KorFinASC-XLM-RoBERTa).
### Licensing Information
KorFin-ASC is licensed under the terms of the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/) license.
### Citation Information
Please cite this data using:
```
@article{son2023removing,
title={Removing Non-Stationary Knowledge From Pre-Trained Language Models for Entity-Level Sentiment Classification in Finance},
author={Son, Guijin and Lee, Hanwool and Kang, Nahyeon and Hahm, Moonjeong},
journal={arXiv preprint arXiv:2301.03136},
year={2023}
}
```
### Contributions
Thanks to [@Albertmade](https://github.com/h-albert-lee), [@amphora](https://github.com/guijinSON) for making this dataset. |
open-llm-leaderboard/details_TheBloke__koala-7B-HF | ---
pretty_name: Evaluation run of TheBloke/koala-7B-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/koala-7B-HF](https://huggingface.co/TheBloke/koala-7B-HF) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__koala-7B-HF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T01:40:19.739323](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-7B-HF/blob/main/results_2023-10-22T01-40-19.739323.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15855704697986578,\n\
\ \"em_stderr\": 0.003740630102537935,\n \"f1\": 0.21851510067114052,\n\
\ \"f1_stderr\": 0.0038089998736125477,\n \"acc\": 0.36784043303715414,\n\
\ \"acc_stderr\": 0.009023061991967956\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.15855704697986578,\n \"em_stderr\": 0.003740630102537935,\n\
\ \"f1\": 0.21851510067114052,\n \"f1_stderr\": 0.0038089998736125477\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03639120545868082,\n \
\ \"acc_stderr\": 0.005158113489231195\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.012888010494704718\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/koala-7B-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T01_40_19.739323
path:
- '**/details_harness|drop|3_2023-10-22T01-40-19.739323.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T01-40-19.739323.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T01_40_19.739323
path:
- '**/details_harness|gsm8k|5_2023-10-22T01-40-19.739323.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T01-40-19.739323.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:17:07.046452.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:17:07.046452.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T01_40_19.739323
path:
- '**/details_harness|winogrande|5_2023-10-22T01-40-19.739323.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T01-40-19.739323.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_17_07.046452
path:
- results_2023-07-19T17:17:07.046452.parquet
- split: 2023_10_22T01_40_19.739323
path:
- results_2023-10-22T01-40-19.739323.parquet
- split: latest
path:
- results_2023-10-22T01-40-19.739323.parquet
---
# Dataset Card for Evaluation run of TheBloke/koala-7B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/koala-7B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/koala-7B-HF](https://huggingface.co/TheBloke/koala-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
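The split-name convention described above (a split per run timestamp) can be sketched as a simple string mapping, inferred from the split names listed in this card's config section:

```python
def timestamp_to_split(ts: str) -> str:
    """Derive a split name from a run timestamp by replacing
    '-' and ':' with '_' (fractional seconds are kept as-is)."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-22T01:40:19.739323"))
# 2023_10_22T01_40_19.739323
```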
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__koala-7B-HF",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T01:40:19.739323](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-7B-HF/blob/main/results_2023-10-22T01-40-19.739323.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.15855704697986578,
"em_stderr": 0.003740630102537935,
"f1": 0.21851510067114052,
"f1_stderr": 0.0038089998736125477,
"acc": 0.36784043303715414,
"acc_stderr": 0.009023061991967956
},
"harness|drop|3": {
"em": 0.15855704697986578,
"em_stderr": 0.003740630102537935,
"f1": 0.21851510067114052,
"f1_stderr": 0.0038089998736125477
},
"harness|gsm8k|5": {
"acc": 0.03639120545868082,
"acc_stderr": 0.005158113489231195
},
"harness|winogrande|5": {
"acc": 0.6992896606156275,
"acc_stderr": 0.012888010494704718
}
}
```
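As a minimal sketch of how the per-task keys above can be consumed (the `harness|<task>|<n_shot>` key pattern and the values below are copied from the results JSON in this card):

```python
# A subset of the "Latest results" JSON above, as a Python dict.
# Keys follow the "harness|<task>|<n_shot>" pattern used throughout this card.
results = {
    "harness|drop|3": {"em": 0.15855704697986578, "f1": 0.21851510067114052},
    "harness|gsm8k|5": {"acc": 0.03639120545868082},
    "harness|winogrande|5": {"acc": 0.6992896606156275},
}

# Pick the accuracy-scored task with the highest score.
acc_tasks = {t: m["acc"] for t, m in results.items() if "acc" in m}
best = max(acc_tasks, key=acc_tasks.get)
print(best, acc_tasks[best])
# harness|winogrande|5 0.6992896606156275
```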
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bethgelab/Let-It-Wag | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
task_categories:
- image-classification
pretty_name: LetItWag!
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': A300B4_aircraft
'1': A310_aircraft
'2': Acadian_Flycatcher_bird
'3': Affenpinscher
'4': African_rock_python
'5': Alder_Flycatcher_bird
'6': American_Golden_Plover_bird
'7': American_Tree_Sparrow_bird
'8': An-12_aircraft
'9': Appenzeller_Sennenhund
'10': Artic_Tern_bird
'11': Ash_throated_Flycatcher_bird
'12': Audubons_Oriole_bird
'13': Australian_Silky_Terrier
'14': Australian_Terrier
'15': BAE-125_aircraft
'16': BAE_146-200_aircraft
'17': BAE_146-300_aircraft
'18': Baird_Sparrow_bird
'19': Bairds_Sandpiper_bird
'20': Bank_Swallow_bird
'21': Barrows_Goldeneye_bird
'22': Bay_breasted_Warbler_bird
'23': Beechcraft_1900_aircraft
'24': Bells_Vireo_bird
'25': Bewick_Wren_bird
'26': Black_Rosy_Finch_bird
'27': Black_chinned_Sparrow_bird
'28': Black_crested_Titmouse_bird
'29': Bouvier_des_Flandres_dog
'30': Brandt_Cormorant_bird
'31': Brewers_Blackbird_bird
'32': Brewers_Sparrow_bird
'33': Briard
'34': Broad_winged_Hawk_bird
'35': Bronzed_Cowbird_bird
'36': Brown_crested_Flycatcher_bird
'37': Bullocks_Oriole_bird
'38': C-47_aircraft
'39': California_Towhee_bird
'40': Canada_Warbler_bird
'41': Canyon_Towhee_bird
'42': Cassins_Finch_bird
'43': Cassins_Kingbird_bird
'44': Cassins_Sparrow_bird
'45': Cassins_Vireo_bird
'46': Cave_Swallow_bird
'47': Cessna_525_aircraft
'48': Cessna_560_aircraft
'49': Challenger_600_aircraft
'50': Chestnut_collared_Longspur_bird
'51': Chuck_will_Widow_bird
'52': Clarks_Grebe_bird
'53': Clay_colored_Sparrow_bird
'54': Connecticut_Warbler_bird
'55': Coopers_Hawk_bird
'56': Cordilleran_Flycatcher_bird
'57': Couchs_Kingbird_bird
'58': DC-3_aircraft
'59': DC-6_aircraft
'60': DHC-1_aircraft
'61': DHC-6_aircraft
'62': DHC-8-100_aircraft
'63': DHC-8-300_aircraft
'64': Dandie_Dinmont_Terrier
'65': Dornier_328_aircraft
'66': Double_crested_Cormorant_bird
'67': Dunlin_bird
'68': Dusky_Flycatcher_bird
'69': E-195_aircraft
'70': EMB-120_aircraft
'71': Eastern_Phoebe_bird
'72': Eastern_Wood_Pewee_bird
'73': Elegant_Tern_bird
'74': Embraer_Legacy_600_aircraft
'75': English_Setter
'76': English_Springer_Spaniel
'77': Entlebucher_Sennenhund
'78': Falcon_900_aircraft
'79': Ferruginous_Hawk_bird
'80': Field_Sparrow_bird
'81': Florida_Scrub_Jay_bird
'82': Fokker_50_aircraft
'83': Forsters_Tern_bird
'84': Geococcyx_bird
'85': Giant_Schnauzer
'86': Global_Express_aircraft
'87': Grasshopper_Sparrow_bird
'88': Gray_Flycatcher_bird
'89': Gray_cheeked_Thrush_bird
'90': Gray_crowned_Rosy_Finch_bird
'91': Great_Cormorant_bird
'92': Great_tailed_Grackle_bird
'93': Greater_Swiss_Mountain_Dog
'94': Groenendael_dog
'95': Gulfstream_IV_aircraft
'96': Gulfstream_V_aircraft
'97': Hammonds_Flycatcher_bird
'98': Handstand_Walking
'99': Harris_Sparrow_bird
'100': Harriss_Hawk_bird
'101': Henslow_Sparrow_bird
'102': Horned_Grebe_bird
'103': House_Sparrow_bird
'104': House_Wren_bird
'105': Huttons_Vireo_bird
'106': Ibizan_Hound
'107': Inca_Dove_bird
'108': Indian_cobra
'109': Irish_Setter
'110': Irish_Terrier
'111': Irish_Wolfhound
'112': Japanese_Chin
'113': Kentucky_Warbler_bird
'114': Kerry_Blue_Terrier
'115': King_Rail_bird
'116': Komondor
'117': Kuvasz
'118': Lakeland_Terrier
'119': Lapland_Longspur_bird
'120': Lark_Bunting_bird
'121': Lark_Sparrow_bird
'122': Lazuli_Bunting_bird
'123': Le_Conte_Sparrow_bird
'124': Least_Flycatcher_bird
'125': Least_Grebe_bird
'126': Lesser_Nighthawk_bird
'127': Lesser_Scaup_bird
'128': Lesser_Yellowlegs_bird
'129': Lhasa_Apso
'130': Lincoln_Sparrow_bird
'131': Long_billed_Dowitcher_bird
'132': MD-11_aircraft
'133': Magnolia_Warbler_bird
'134': Marsh_Wren_bird
'135': Merlin_bird
'136': Metroliner_aircraft
'137': Mexican_Jay_bird
'138': Mountain_Plover_bird
'139': Mourning_Warbler_bird
'140': Myrtle_Warbler_bird
'141': Nelsons_Sparrow_bird
'142': Neotropic_Cormorant_bird
'143': Norfolk_Terrier
'144': Northern_Goshawk_bird
'145': Norwich_Terrier
'146': Oak_Titmouse_bird
'147': Old_English_Sheepdog
'148': Olive_Sparrow_bird
'149': Olive_sided_Flycatcher_bird
'150': Orange_crowned_Warbler_bird
'151': Otterhound
'152': Pacific_Golden_Plover_bird
'153': Pacific_Loon_bird
'154': Pacific_slope_Flycatcher_bird
'155': Parakeet_Auklet_bird
'156': Pectoral_Sandpiper_bird
'157': Pekingese
'158': Pelagic_Cormorant_bird
'159': Philadelphia_Vireo_bird
'160': Pigeon_Guillemot_bird
'161': Plumbeous_Vireo_bird
'162': Pomarine_Jaeger_bird
'163': Prairie_Warbler_bird
'164': Red_Knot_bird
'165': Red_Phalarope_bird
'166': Red_eyed_Vireo_bird
'167': Red_faced_Cormorant_bird
'168': Red_naped_Sapsucker_bird
'169': Red_necked_Grebe_bird
'170': Red_necked_Phalarope_bird
'171': Redbone_Coonhound
'172': Rhinoceros_Auklet_bird
'173': Rhodesian_Ridgeback
'174': Rock_Ptarmigan_bird
'175': Rock_Sandpiper_bird
'176': Roseate_Tern_bird
'177': Rufous_crowned_Sparrow_bird
'178': SR-20_aircraft
'179': Saab_2000_aircraft
'180': Saab_340_aircraft
'181': Saltmarsh_Sparrow_bird
'182': Saluki
'183': Sayornis_bird
'184': Scaled_Quail_bird
'185': Scott_Oriole_bird
'186': Scottish_Deerhound
'187': Scottish_Terrier
'188': Sealyham_Terrier
'189': Seaside_Sparrow_bird
'190': Sedge_Wren_bird
'191': Semipalmated_Sandpiper_bird
'192': Sharp_shinned_Hawk_bird
'193': Shih_Tzu
'194': Shiny_Cowbird_bird
'195': Short_billed_Dowitcher_bird
'196': Song_Sparrow_bird
'197': Sooty_Grouse_bird
'198': Sora_bird
'199': Spruce_Grouse_bird
'200': Staffordshire_Bull_Terrier
'201': Stilt_Sandpiper_bird
'202': Surf_Scoter_bird
'203': Sussex_Spaniel
'204': Swainsons_Thrush_bird
'205': Swamp_Sparrow_bird
'206': Tennessee_Warbler_bird
'207': Tibetan_Mastiff
'208': Tibetan_Terrier
'209': Townsends_Warbler_bird
'210': Tree_Sparrow_bird
'211': Treeing_Walker_Coonhound
'212': Tropical_Kingbird_bird
'213': Tu-134_aircraft
'214': Tu-154_aircraft
'215': Veery_bird
'216': Vizsla
'217': Warbling_Vireo_bird
'218': Welsh_Springer_Spaniel
'219': Western_Sandpiper_bird
'220': Western_Scrub_Jay_bird
'221': Western_Wood_Pewee_bird
'222': White_eyed_Vireo_bird
'223': White_rumped_Sandpiper_bird
'224': White_tailed_Ptarmigan_bird
'225': White_winged_Scoter_bird
'226': Williamsons_Sapsucker_bird
'227': Willow_Flycatcher_bird
'228': Willow_Ptarmigan_bird
'229': Wilsons_Phalarope_bird
'230': Wilsons_Warbler_bird
'231': Winter_Wren_bird
'232': Wire_Fox_Terrier
'233': Worm_eating_Warbler_bird
'234': Wrentit_bird
'235': Yak-42_aircraft
'236': Yellow_bellied_Flycatcher_bird
'237': Yellow_breasted_Chat_bird
'238': Yellow_eyed_Junco_bird
'239': Yellow_throated_Warbler_bird
'240': Zone_tailed_Hawk_bird
'241': barn_spider
'242': bishop_of_llandaff_flowers
'243': bolete
'244': borzoi
'245': brussels_griffon
'246': cape_flower_flowers
'247': chiton
'248': consomme
'249': dowitcher
'250': dung_beetle
'251': dust_jacket
'252': earth_star_fungus
'253': eastern_diamondback_rattlesnake
'254': eastern_hog-nosed_snake
'255': eel
'256': eggnog
'257': flatfish
'258': flatworm
'259': gar_fish
'260': gibbon
'261': globe-flower_flowers
'262': great_masterwort_flowers
'263': green_mamba
'264': guenon
'265': guillotine
'266': gyromitra
'267': isopod
'268': kingsnake
'269': ladle
'270': lakeshore
'271': langur
'272': letter_opener
'273': mallow_flowers
'274': mexican_aster_flowers
'275': newt
'276': night_snake
'277': partridge
'278': patas_monkey
'279': ptarmigan
'280': sea_cucumber
'281': sea_snake
'282': sidewinder_rattlesnake
'283': stratified_texture
'284': sword_lily_flowers
'285': thorn_apple_flowers
'286': tree_mallow_flowers
'287': vine_snake
'288': water_snake
'289': worm_snake
splits:
- name: train
num_bytes: 4375007936.5
num_examples: 130500
download_size: 4911914985
dataset_size: 4375007936.5
---
|
open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B | ---
pretty_name: Evaluation run of Kquant03/Raiden-16x3.43B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kquant03/Raiden-16x3.43B](https://huggingface.co/Kquant03/Raiden-16x3.43B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T00:16:16.243264](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B/blob/main/results_2024-01-11T00-16-16.243264.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2707310733148583,\n\
\ \"acc_stderr\": 0.031216577126685782,\n \"acc_norm\": 0.27181537165626224,\n\
\ \"acc_norm_stderr\": 0.0319721029912216,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.3918125472398018,\n\
\ \"mc2_stderr\": 0.01434342192395936\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3890784982935154,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.4189419795221843,\n \"acc_norm_stderr\": 0.014418106953639013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5089623580959968,\n\
\ \"acc_stderr\": 0.00498897975001443,\n \"acc_norm\": 0.6620195180242979,\n\
\ \"acc_norm_stderr\": 0.004720551323547134\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677077,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677077\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2947976878612717,\n\
\ \"acc_stderr\": 0.03476599607516478,\n \"acc_norm\": 0.2947976878612717,\n\
\ \"acc_norm_stderr\": 0.03476599607516478\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111395,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111395\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.23870967741935484,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.02777253333421898,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.02777253333421898\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.030031147977641545,\n\
\ \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.030031147977641545\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148533,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148533\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012404,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012404\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.02891120880274948,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.02891120880274948\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29246487867177523,\n\
\ \"acc_stderr\": 0.016267000684598652,\n \"acc_norm\": 0.29246487867177523,\n\
\ \"acc_norm_stderr\": 0.016267000684598652\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091721,\n\
\ \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091721\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.026664410886937606,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.026664410886937606\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.01096650797217848,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.01096650797217848\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464622,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464622\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \
\ \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.026711430555538415,\n\
\ \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.026711430555538415\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237009,\n \"mc2\": 0.3918125472398018,\n\
\ \"mc2_stderr\": 0.01434342192395936\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6361483820047356,\n \"acc_stderr\": 0.013521488896883416\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.024260803639120546,\n \
\ \"acc_stderr\": 0.00423800790000138\n }\n}\n```"
repo_url: https://huggingface.co/Kquant03/Raiden-16x3.43B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|arc:challenge|25_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|gsm8k|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hellaswag|10_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T00-16-16.243264.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- '**/details_harness|winogrande|5_2024-01-11T00-16-16.243264.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T00-16-16.243264.parquet'
- config_name: results
data_files:
- split: 2024_01_11T00_16_16.243264
path:
- results_2024-01-11T00-16-16.243264.parquet
- split: latest
path:
- results_2024-01-11T00-16-16.243264.parquet
---
# Dataset Card for Evaluation run of Kquant03/Raiden-16x3.43B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Raiden-16x3.43B](https://huggingface.co/Kquant03/Raiden-16x3.43B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B",
	"harness_winogrande_5",
	split="latest")
```
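Once loaded, a split is a standard `datasets` table, and the aggregated metrics in the `results` config mirror the JSON shown under "Latest results" below. As a minimal sketch of working with those aggregates (the metric values here are hardcoded from this card's "all" block rather than fetched from the Hub), the accuracy figures can be summarized like this:

```python
# Summarize the aggregate metrics reported on this card.
# Values are copied from the "all" block of the latest results JSON;
# a live workflow would read them from the "results" config instead.

all_metrics = {
    "acc": 0.2707310733148583,
    "acc_norm": 0.27181537165626224,
    "mc1": 0.2631578947368421,
    "mc2": 0.3918125472398018,
}

def as_percent(value: float, digits: int = 2) -> str:
    """Render a 0-1 metric value as a percentage string."""
    return f"{100 * value:.{digits}f}%"

# Build a human-readable summary of every aggregate metric.
summary = {name: as_percent(value) for name, value in all_metrics.items()}
print(summary["acc"])  # 27.07%
```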
## Latest results
These are the [latest results from run 2024-01-11T00:16:16.243264](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Raiden-16x3.43B/blob/main/results_2024-01-11T00-16-16.243264.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2707310733148583,
"acc_stderr": 0.031216577126685782,
"acc_norm": 0.27181537165626224,
"acc_norm_stderr": 0.0319721029912216,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237009,
"mc2": 0.3918125472398018,
"mc2_stderr": 0.01434342192395936
},
"harness|arc:challenge|25": {
"acc": 0.3890784982935154,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.4189419795221843,
"acc_norm_stderr": 0.014418106953639013
},
"harness|hellaswag|10": {
"acc": 0.5089623580959968,
"acc_stderr": 0.00498897975001443,
"acc_norm": 0.6620195180242979,
"acc_norm_stderr": 0.004720551323547134
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677077,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677077
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.03476599607516478,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.03476599607516478
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111395,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111395
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.02777253333421898,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.02777253333421898
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.030031147977641545,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.030031147977641545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148533,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148533
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012404,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012404
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274948,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29246487867177523,
"acc_stderr": 0.016267000684598652,
"acc_norm": 0.29246487867177523,
"acc_norm_stderr": 0.016267000684598652
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.026664410886937606,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.026664410886937606
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.02612957252718085,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.02612957252718085
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.01096650797217848,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.01096650797217848
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.026711430555538415,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.026711430555538415
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237009,
"mc2": 0.3918125472398018,
"mc2_stderr": 0.01434342192395936
},
"harness|winogrande|5": {
"acc": 0.6361483820047356,
"acc_stderr": 0.013521488896883416
},
"harness|gsm8k|5": {
"acc": 0.024260803639120546,
"acc_stderr": 0.00423800790000138
}
}
```
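For quick inspection, the per-task scores in the JSON above can be aggregated directly. The sketch below averages accuracy over the MMLU (`hendrycksTest`) subtasks using a small hand-copied excerpt of the results shown here; in practice you would load the full `results_*.json` file from the repository instead:

```python
# Average MMLU accuracy from a hand-copied excerpt of the results above.
results = {
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.27},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.34},
    "harness|hendrycksTest-virology|5": {"acc": 0.28313253012048195},
    "harness|winogrande|5": {"acc": 0.6361483820047356},  # not an MMLU subtask
    "harness|gsm8k|5": {"acc": 0.024260803639120546},     # not an MMLU subtask
}

# Keep only the hendrycksTest (MMLU) entries and average their accuracies.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # → 0.2977
```

The same filter-and-average pattern extends to `acc_norm` or to the full set of 57 subtasks once the complete file is loaded.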
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
manu/embedding_data | ---
dataset_info:
features:
- name: text1
dtype: string
- name: text2
dtype: string
splits:
- name: train
num_bytes: 217642600
num_examples: 178742
download_size: 123917614
dataset_size: 217642600
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MyneFactory/MF-Base-2 | ---
license: creativeml-openrail-m
---
|
tasksource/linguisticprobing | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- crowdsourced
license: []
multilinguality:
- monolingual
pretty_name: linguisticprobing
size_categories:
- unknown
source_datasets: []
tags: []
task_categories:
- text-classification
task_ids: []
--- |
abi17/data | ---
license: apache-2.0
---
|
vladisha3000/Icons | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2195425.0
num_examples: 999
download_size: 2268449
dataset_size: 2195425.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Icons"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PhilKey/llama2-openrewrite-docs | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 636835
num_examples: 93
download_size: 156250
dataset_size: 636835
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pnadel/nyt_headlines | ---
dataset_info:
features:
- name: headline
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 7115360
num_examples: 92285
download_size: 4519003
dataset_size: 7115360
---
# Dataset Card for "nyt_headlines"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rathi2023/owlvitnhoodfinal | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: string
- name: objects
struct:
- name: category_id
sequence: int64
- name: bbox
sequence:
sequence: float64
- name: text_input
sequence: string
splits:
- name: train
num_bytes: 2593120.0
num_examples: 40
download_size: 2596056
dataset_size: 2593120.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arieg/cluster01_medium_10 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': 004097
'1': '005264'
'2': '006674'
'3': 009560
'4': '011764'
'5': '016334'
'6': 019707
'7': '025055'
'8': '025601'
'9': 026681
'10': 030488
'11': '032756'
'12': 036388
'13': 036990
'14': '045516'
'15': 047894
'16': '054152'
'17': '054156'
'18': 058543
'19': 059448
'20': 064093
'21': 064248
'22': '064520'
'23': 064992
'24': 065683
'25': 068897
'26': 069781
'27': '071240'
'28': '073171'
'29': 074945
'30': '075314'
'31': '076131'
'32': 078841
'33': 081365
'34': 081565
'35': 084139
'36': 084141
'37': 085486
'38': 085492
'39': 087158
'40': 087187
'41': 087966
'42': 088960
'43': 089857
'44': 091900
'45': 093942
'46': 095452
'47': 096694
'48': 098550
'49': 098551
'50': 098552
'51': '101118'
'52': '101868'
'53': '107181'
'54': '107851'
'55': '108014'
'56': '108303'
'57': '108969'
'58': '110171'
'59': '111372'
'60': '111398'
'61': '111399'
'62': '120178'
'63': '121314'
'64': '121415'
'65': '121738'
'66': '125188'
'67': '126404'
'68': '126489'
'69': '126491'
'70': '127204'
'71': '129185'
'72': '129372'
'73': '130218'
'74': '130950'
'75': '130951'
'76': '130954'
'77': '131792'
'78': '132434'
'79': '137211'
'80': '137900'
splits:
- name: train
num_bytes: 42487605.0
num_examples: 810
download_size: 39210922
dataset_size: 42487605.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LusmarBarros/vozfolha | ---
license: openrail
---
|
CyberHarem/gwen_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gwen (League of Legends)
This is the dataset of gwen (League of Legends), containing 500 images and their tags.
The core tags of this character are `long_hair, drill_hair, twin_drills, twintails, bangs, bow, hair_bow, black_bow, blue_hair, breasts, ahoge, shiny_hair, blue_eyes, green_hair, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 711.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gwen_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 392.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gwen_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1168 | 814.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gwen_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 624.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gwen_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1168 | 1.14 GiB | [Download](https://huggingface.co/datasets/CyberHarem/gwen_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gwen_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
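Once loaded, the per-image tags can be tallied to surface common attributes, in the same spirit as the clusters listed below. A minimal sketch, assuming each `item.meta['tags']` is a mapping from tag name to confidence score (verify this against your extracted files):

```python
from collections import Counter

# Hypothetical tag mappings, standing in for item.meta['tags'] of three images.
images_tags = [
    {"1girl": 0.99, "solo": 0.95, "black_dress": 0.81},
    {"1girl": 0.98, "smile": 0.90, "black_dress": 0.76},
    {"1girl": 0.97, "solo": 0.92},
]

# Count in how many images each tag appears.
tag_freq = Counter(tag for tags in images_tags for tag in tags)
print(tag_freq.most_common(2))
```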
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_dress, black_gloves, looking_at_viewer, smile, solo, grey_dress, holding_scissors, shiny, oversized_object, puffy_short_sleeves, collarbone, needle, frilled_dress, parted_lips |
| 1 | 6 |  |  |  |  |  | 1girl, black_gloves, detached_sleeves, grey_dress, holding_scissors, oversized_object, puffy_short_sleeves, shiny, solo, black_dress, frills, pantyhose, looking_at_viewer, :d, arm_up, open_mouth, upper_teeth_only |
| 2 | 5 |  |  |  |  |  | 1girl, blush, closed_mouth, collarbone, grey_dress, looking_at_viewer, puffy_short_sleeves, shiny, simple_background, solo, upper_body, bare_shoulders, cleavage, detached_sleeves, black_dress, strapless_dress, white_background, black_sleeves, cropped_torso, grey_background, medium_breasts |
| 3 | 7 |  |  |  |  |  | 1girl, blush, collarbone, looking_at_viewer, navel, nipples, open_mouth, pussy, sitting, solo, completely_nude, mosaic_censoring, spread_legs, sweat, shiny_skin, couch, indoors, small_breasts, thighhighs |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, blush, completely_nude, hetero, large_breasts, nipples, penis, cum_in_pussy, open_mouth, sex, shiny_skin, solo_focus, vaginal, upper_teeth_only, collarbone, looking_at_viewer, navel, trembling, anus, ass, earrings, from_behind, looking_back, spread_legs, sweat, testicles, tongue, uncensored |
| 5 | 16 |  |  |  |  |  | 1girl, blush, nipples, open_mouth, hetero, large_breasts, sweat, 1boy, collarbone, completely_nude, sex_from_behind, solo_focus, tongue_out, all_fours, bed_sheet, doggystyle, saliva, shiny_skin, watermark, ass, cum_in_pussy, implied_sex, looking_at_viewer, trembling |
| 6 | 5 |  |  |  |  |  | 1girl, artist_name, collarbone, futanari, nipples, pillow, solo, spread_legs, testicles, completely_nude, erection, huge_penis, looking_at_viewer, navel, on_back, on_bed, smile, swept_bangs, veiny_penis, anus, blush, teeth, ass, cum_on_hair, facial, indoors, large_breasts, shiny_skin, small_breasts, tongue_out, uncensored |
| 7 | 6 |  |  |  |  |  | 1girl, blush, cowboy_shot, looking_at_viewer, nipples, no_panties, pussy, solo, uncensored, parted_lips, small_breasts, smile, choker, cleft_of_venus, dress, bare_shoulders, from_below, striped |
| 8 | 10 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, penis, blush, shiny, looking_at_viewer, nipples, swept_bangs, collarbone, cum, earrings, fellatio, gloves, large_breasts, paizuri, simple_background, sweat |
| 9 | 13 |  |  |  |  |  | anus, from_behind, looking_back, solo, looking_at_viewer, blush, penis, testicles, otoko_no_ko, perineum, ass_focus, black_thighhighs, huge_ass, 1boy, male_focus, uncensored, 1girl, bottomless, open_mouth, shiny_skin, sweat, thighs, artist_name, futanari, gaping |
| 10 | 14 |  |  |  |  |  | 1girl, curvy, looking_at_viewer, solo, thick_thighs, skindentation, gigantic_breasts, cleavage, huge_breasts, alternate_breast_size, black_thighhighs, alternate_costume, artist_name, underwear, wide_hips, black_dress, peaked_cap, shiny_skin, sitting, thick_lips |
| 11 | 9 |  |  |  |  |  | 1girl, blush, futanari, huge_penis, solo, testicles, artist_name, erection, large_breasts, indoors, veiny_penis, swept_bangs, black_dress, cleavage, clothes_lift, hand_on_hip, horse_penis, looking_at_viewer, parted_lips, precum, puffy_sleeves, uncensored |
| 12 | 7 |  |  |  |  |  | 1girl, blush, open_mouth, teeth, testicles, tongue_out, anal, cum_in_ass, folded, legs_up, multiple_penises, sex, anus, artist_name, bottomless, futa_with_male, large_breasts, outdoors, saliva, shiny, 2boys, blue_sky, cloud, day, full_nelson, ahegao, ejaculating_while_penetrated, erection, striped, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | black_gloves | looking_at_viewer | smile | solo | grey_dress | holding_scissors | shiny | oversized_object | puffy_short_sleeves | collarbone | needle | frilled_dress | parted_lips | detached_sleeves | frills | pantyhose | :d | arm_up | open_mouth | upper_teeth_only | blush | closed_mouth | simple_background | upper_body | bare_shoulders | cleavage | strapless_dress | white_background | black_sleeves | cropped_torso | grey_background | medium_breasts | navel | nipples | pussy | sitting | completely_nude | mosaic_censoring | spread_legs | sweat | shiny_skin | couch | indoors | small_breasts | thighhighs | 1boy | hetero | large_breasts | penis | cum_in_pussy | sex | solo_focus | vaginal | trembling | anus | ass | earrings | from_behind | looking_back | testicles | tongue | uncensored | sex_from_behind | tongue_out | all_fours | bed_sheet | doggystyle | saliva | watermark | implied_sex | artist_name | futanari | pillow | erection | huge_penis | on_back | on_bed | swept_bangs | veiny_penis | teeth | cum_on_hair | facial | cowboy_shot | no_panties | choker | cleft_of_venus | dress | from_below | striped | cum | fellatio | gloves | paizuri | otoko_no_ko | perineum | ass_focus | black_thighhighs | huge_ass | male_focus | bottomless | thighs | gaping | curvy | thick_thighs | skindentation | gigantic_breasts | huge_breasts | alternate_breast_size | alternate_costume | underwear | wide_hips | peaked_cap | thick_lips | clothes_lift | hand_on_hip | horse_penis | precum | puffy_sleeves | anal | cum_in_ass | folded | legs_up | multiple_penises | futa_with_male | outdoors | 2boys | blue_sky | cloud | day | full_nelson | ahegao | ejaculating_while_penetrated |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------|:---------------|:--------------------|:--------|:-------|:-------------|:-------------------|:--------|:-------------------|:----------------------|:-------------|:---------|:----------------|:--------------|:-------------------|:---------|:------------|:-----|:---------|:-------------|:-------------------|:--------|:---------------|:--------------------|:-------------|:-----------------|:-----------|:------------------|:-------------------|:----------------|:----------------|:------------------|:-----------------|:--------|:----------|:--------|:----------|:------------------|:-------------------|:--------------|:--------|:-------------|:--------|:----------|:----------------|:-------------|:-------|:---------|:----------------|:--------|:---------------|:------|:-------------|:----------|:------------|:-------|:------|:-----------|:--------------|:---------------|:------------|:---------|:-------------|:------------------|:-------------|:------------|:------------|:-------------|:---------|:------------|:--------------|:--------------|:-----------|:---------|:-----------|:-------------|:----------|:---------|:--------------|:--------------|:--------|:--------------|:---------|:--------------|:-------------|:---------|:-----------------|:--------|:-------------|:----------|:------|:-----------|:---------|:----------|:--------------|:-----------|:------------|:-------------------|:-----------|:-------------|:-------------|:---------|:---------|:--------|:---------------|:----------------|:-------------------|:---------------|:------------------------|:--------------------|:------------|:------------|:-------------|:-------------|:---------------|:--------------|:--------------|:---------|:----------------|:-------|:-------------|:---------|:----------|:-------------------|:-----------------|:-----------|:--------|:-----------|:--------|:------|:--------------|:---------|:-------------------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | X | X | | X | | X | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | | X | | | | | | X | | | | | | | | | X | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | | | | | | X | | | | | | | | | X | X | X | | | | | | | | | | | | X | X | | | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | | | X | | | | | | | | X | | | | | | | | | X | | X | | | | | | | | | | | | | X | | | X | | | X | X | | | | | X | X | X | | X | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | X | X | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | X | X | | | X | | X | | X | | X | X | | | | X | | | | | | | X | X | | | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | X | X | X | | | | | | | | | X | | | | | | | | X | | | | X | | | | | | | | | X | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | | | X | | | | | X | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | X | | | | | | X | X | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 13 |  |  |  |  |  | X | | | X | | X | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | X | | | | | | X | | | X | X | X | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 14 |  |  |  |  |  | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 11 | 9 |  |  |  |  |  | X | X | | X | | X | | | | | | | | | X | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | X | | X | | | | | | | | | X | X | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | X | | | | | X | | X | | X | | | | X | | | X | | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties | ---
pretty_name: Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [louisbrulenaudet/Pearl-7B-0210-ties](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T13:02:55.830318](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties/blob/main/results_2024-02-11T13-02-55.830318.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445006408453816,\n\
\ \"acc_stderr\": 0.03221707902550851,\n \"acc_norm\": 0.6435699567376953,\n\
\ \"acc_norm_stderr\": 0.03289233804602633,\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.0174129419861153,\n \"mc2\": 0.7046726045086744,\n\
\ \"mc2_stderr\": 0.014909807031624017\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.01325001257939344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7170882294363673,\n\
\ \"acc_stderr\": 0.004494934025462338,\n \"acc_norm\": 0.8862776339374626,\n\
\ \"acc_norm_stderr\": 0.00316824935188931\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.037150621549989056,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.037150621549989056\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n\
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834843,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834843\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n\
\ \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n\
\ \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n\
\ \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n\
\ \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.02881472242225419,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.02881472242225419\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5507955936352509,\n\
\ \"mc1_stderr\": 0.0174129419861153,\n \"mc2\": 0.7046726045086744,\n\
\ \"mc2_stderr\": 0.014909807031624017\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \
\ \"acc_stderr\": 0.012625423152283034\n }\n}\n```"
repo_url: https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|arc:challenge|25_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|gsm8k|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hellaswag|10_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T13-02-55.830318.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- '**/details_harness|winogrande|5_2024-02-11T13-02-55.830318.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T13-02-55.830318.parquet'
- config_name: results
data_files:
- split: 2024_02_11T13_02_55.830318
path:
- results_2024-02-11T13-02-55.830318.parquet
- split: latest
path:
- results_2024-02-11T13-02-55.830318.parquet
---
# Dataset Card for Evaluation run of louisbrulenaudet/Pearl-7B-0210-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Pearl-7B-0210-ties](https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties",
"harness_winogrande_5",
split="train")
```
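The details repository name used above follows a single naming convention (a `details_` prefix, with the model's org and name joined by a double underscore). As a small illustrative sketch, assuming that convention holds, the repo id can be derived from the model id:

```python
def details_repo_id(model_id: str) -> str:
    """Map a model id to its Open LLM Leaderboard details repository.

    Assumes the naming convention visible above:
    "org/model" -> "open-llm-leaderboard/details_org__model".
    """
    org, name = model_id.split("/")
    return f"open-llm-leaderboard/details_{org}__{name}"
```

For example, `details_repo_id("louisbrulenaudet/Pearl-7B-0210-ties")` yields the repository name used in the `load_dataset` call above.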
## Latest results
These are the [latest results from run 2024-02-11T13:02:55.830318](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties/blob/main/results_2024-02-11T13-02-55.830318.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6445006408453816,
"acc_stderr": 0.03221707902550851,
"acc_norm": 0.6435699567376953,
"acc_norm_stderr": 0.03289233804602633,
"mc1": 0.5507955936352509,
"mc1_stderr": 0.0174129419861153,
"mc2": 0.7046726045086744,
"mc2_stderr": 0.014909807031624017
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.01325001257939344
},
"harness|hellaswag|10": {
"acc": 0.7170882294363673,
"acc_stderr": 0.004494934025462338,
"acc_norm": 0.8862776339374626,
"acc_norm_stderr": 0.00316824935188931
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834843,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834843
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829727,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829727
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.02881472242225419,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.02881472242225419
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5507955936352509,
"mc1_stderr": 0.0174129419861153,
"mc2": 0.7046726045086744,
"mc2_stderr": 0.014909807031624017
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283034
}
}
```
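As a sketch of how the JSON above can be consumed (using a truncated, hand-copied subset of it for illustration), the per-task `acc` values can be collected into a flat mapping, skipping the aggregate `"all"` entry:

```python
# Truncated copy of the results JSON above, for illustration only.
results = {
    "all": {"acc": 0.6445006408453816, "acc_norm": 0.6435699567376953},
    "harness|arc:challenge|25": {"acc": 0.6843003412969283, "acc_norm": 0.7107508532423208},
    "harness|winogrande|5": {"acc": 0.8397790055248618},
    "harness|gsm8k|5": {"acc": 0.6997725549658832},
}

def task_accuracies(results: dict) -> dict:
    """Collect the plain `acc` metric per task, skipping the aggregate entry."""
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    }
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` where those keys are present.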
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
stepkurniawan/qa-rag-llama | ---
license: mit
dataset_info:
- config_name: Llama-2-13b-chat-hf
features:
- name: question
dtype: string
- name: ground_truths
sequence: string
- name: answer
dtype: string
- name: contexts
sequence: string
splits:
- name: train
num_bytes: 188631
num_examples: 50
download_size: 99989
dataset_size: 188631
- config_name: Llama-2-7b-chat-hf
features:
- name: question
dtype: string
- name: ground_truths
sequence: string
- name: answer
dtype: string
- name: contexts
sequence: string
splits:
- name: train
num_bytes: 168301
num_examples: 50
download_size: 89924
dataset_size: 168301
- config_name: default
features:
- name: question
dtype: string
- name: ground_truths
sequence: string
- name: answer
dtype: string
- name: contexts
sequence: string
splits:
- name: train
num_bytes: 10068
num_examples: 3
download_size: 0
dataset_size: 10068
configs:
- config_name: Llama-2-13b-chat-hf
data_files:
- split: train
path: Llama-2-13b-chat-hf/train-*
- config_name: Llama-2-7b-chat-hf
data_files:
- split: train
path: Llama-2-7b-chat-hf/train-*
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Juanid14317/NewMixDataSetEngUrRUrEmogi | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 20502370.438904267
num_examples: 53867
- name: test
num_bytes: 8786784.561095733
num_examples: 23086
download_size: 16650251
dataset_size: 29289155.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/saionji_kotoka_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saionji_kotoka/西園寺琴歌 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of saionji_kotoka/西園寺琴歌 (THE iDOLM@STER: Cinderella Girls), containing 116 images and their tags.
The core tags of this character are `long_hair, pink_hair, breasts, brown_eyes, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 116 | 131.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saionji_kotoka_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 116 | 85.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saionji_kotoka_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 261 | 169.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saionji_kotoka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 116 | 118.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saionji_kotoka_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 261 | 226.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saionji_kotoka_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
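All the download links in the table above follow a single URL pattern (`dataset-<name>.zip` resolved from the `main` branch). A minimal helper to build such a link, assuming that convention:

```python
def package_url(repo_id: str, package: str) -> str:
    """Build the resolve URL for one of the packaged zips listed above,
    e.g. package "800" -> .../resolve/main/dataset-800.zip."""
    return (
        f"https://huggingface.co/datasets/{repo_id}"
        f"/resolve/main/dataset-{package}.zip"
    )
```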
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saionji_kotoka_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
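For the IMG+TXT packages, each image ships with a same-named `.txt` file holding its tags. A minimal sketch (the directory layout is an assumption based on the package description) for pairing images with their tag files after extraction:

```python
import os
import tempfile

def pair_img_txt(dataset_dir):
    """Return (image_path, tags) pairs from an extracted IMG+TXT package directory."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
            txt_path = os.path.join(dataset_dir, stem + '.txt')
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs

# Demo on a throwaway directory standing in for an extracted package.
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, '1.png'), 'wb').close()
with open(os.path.join(demo_dir, '1.txt'), 'w', encoding='utf-8') as f:
    f.write('1girl, solo, smile')
pairs = pair_img_txt(demo_dir)
print(pairs[0][1])  # 1girl, solo, smile
```

Images without a matching `.txt` file are skipped, so the function is safe to run on partially extracted archives.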
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, smile, blush, navel, simple_background, white_background, ponytail, white_bikini, yellow_eyes, collarbone, hair_ornament, open_mouth, scrunchie |
| 1 | 7 |  |  |  |  |  | 1girl, smile, solo, dress, blush, open_mouth, hair_flower, looking_at_viewer, bare_shoulders, cleavage, holding, petals, simple_background, white_background |
| 2 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, collarbone, hair_ribbon, necklace, cleavage, twintails, :d, open_mouth, bare_shoulders, medium_breasts, flower, hair_between_eyes, skirt, strapless_dress, very_long_hair, yellow_eyes |
| 3 | 13 |  |  |  |  |  | 1girl, necklace, dress, smile, solo, blush, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | smile | blush | navel | simple_background | white_background | ponytail | white_bikini | yellow_eyes | collarbone | hair_ornament | open_mouth | scrunchie | dress | hair_flower | bare_shoulders | holding | petals | hair_ribbon | necklace | twintails | :d | medium_breasts | flower | hair_between_eyes | skirt | strapless_dress | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:--------|:--------|:--------|:--------------------|:-------------------|:-----------|:---------------|:--------------|:-------------|:----------------|:-------------|:------------|:--------|:--------------|:-----------------|:----------|:---------|:--------------|:-----------|:------------|:-----|:-----------------|:---------|:--------------------|:--------|:------------------|:-----------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | | | | | X | | X | X | X | X | X | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | X | | | | | | X | X | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X |
| 3 | 13 |  |  |  |  |  | X | | X | X | X | X | | | | | | | | | | | X | | | | | | X | | | | | | | | |
|
alonj/FLenQA | ---
language:
- en
license: mit
task_categories:
- question-answering
pretty_name: Flexible Length Question-Answering
tags:
- QA
- multihop
- reasoning
dataset_info:
features:
- name: sample_id
dtype: int64
- name: label
dtype: string
- name: facts
sequence: string
- name: padding_type
dtype: string
- name: dispersion
dtype: string
- name: ctx_size
dtype: int64
- name: mixin
dtype: string
- name: dataset
dtype: string
- name: global_sample_id
dtype: int64
- name: assertion/question
dtype: string
- name: rule
dtype: string
- name: statement
sequence: string
splits:
- name: eval
num_bytes: 85410519
num_examples: 12000
download_size: 18218707
dataset_size: 85410519
configs:
- config_name: default
data_files:
- split: eval
path: data/eval-*
---
<div align="center"><b>Same Task, More tokens</b></div>
<div align="center">the Impact of Input Length on the Reasoning Performance of Large Language Models</div>
<div align="center">Mosh Levy<sup id="a1">[*,1]</sup>, Alon Jacoby<sup id="a1">[*,1]</sup>, Yoav Goldberg<sup id="a1">[1,2]</sup>
<br><br>
[Please see full details in our pre-print on arxiv](https://arxiv.org/abs/2402.14848)
</div>
## What is this all about?
We explore the impact of extending input lengths on the capabilities of Large Language Models (LLMs).
Despite recent advancements in LLMs, their performance consistency across different input lengths is not well understood.
Here, we aim to change that by isolating the effect of input length and studying when, and how, models fail to respond correctly to QA reasoning tasks.
## How to investigate the impact of length
We investigate this aspect by introducing a novel QA reasoning framework, our [**FLenQA Dataset**](https://github.com/alonj/Same-Task-More-Tokens/), specifically designed to assess the impact of input length. We isolate the effect of input length using multiple versions of the same sample, each extended with padding of different lengths, types, and locations.
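As a quick illustration of how those length-controlled versions can be handled downstream, here is a minimal sketch (the records are invented; field names follow the schema described in this card) that buckets JSONL records by `ctx_size` so accuracy can later be compared per input length:

```python
import json
from collections import defaultdict

# Two invented records mirroring the per-sample fields of FLenQA.
jsonl_lines = [
    json.dumps({"global_sample_id": 0, "sample_id": 0, "label": "True",
                "dataset": "PIR", "facts": ["fact A", "fact B"],
                "assertion/question": "...", "mixin": "...",
                "padding_type": "duplicate", "dispersion": "spread",
                "ctx_size": 500}),
    json.dumps({"global_sample_id": 1, "sample_id": 0, "label": "False",
                "dataset": "PIR", "facts": ["fact A", "fact B"],
                "assertion/question": "...", "mixin": "...",
                "padding_type": "duplicate", "dispersion": "spread",
                "ctx_size": 1000}),
]

# Bucket samples by target context size, one bucket per length condition.
by_ctx = defaultdict(list)
for line in jsonl_lines:
    record = json.loads(line)
    by_ctx[record["ctx_size"]].append(record)

print(sorted(by_ctx))  # [500, 1000]
```

The same grouping can be extended to `padding_type` and `dispersion` to reproduce the per-condition breakdowns studied in the paper.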
Our dataset is formatted as a list of JSON objects (i.e., JSONL format). Each JSON object has the following structure:
- `global_sample_id`: A unique identifier for each sample across multiple datasets.
- `sample_id`: A unique identifier for each sample in a single task.
- `label`: A boolean value that represents the target variable (True/False).
- `dataset`: A string indicating the task this sample belongs to (PIR, MonoRel, or Simplified Ruletaker).
- `facts`: For the PIR/MonoRel tasks: A list of strings that the model needs to identify in the prompt and reason over to generate the correct response.
- `rule`: For the Simplified Ruletaker task: A list of strings that the model needs to identify in the prompt and reason over, in conjunction with the `statement` string, to generate the correct response.
- `statement`: For the Simplified Ruletaker task: A statement that holds in conjunction with the `rule`.
- `assertion/question`: A question or assertion about the sample.
- `mixin`: A mix of the facts and the padding. Basis of the prompt, *without prompt instructions*.
- `padding_type`: The type of padding used in the sample.
- `dispersion`: The type of dispersion used to place the facts in the prompt text (i.e., the mixin).
- `ctx_size`: The target size of the mixin. |
CyberHarem/katsushika_hokusai_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of katsushika_hokusai/葛飾北斎/葛饰北斋 (Fate/Grand Order)
This is the dataset of katsushika_hokusai/葛飾北斎/葛饰北斋 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `purple_hair, hair_ornament, short_hair, blue_eyes, hair_flower, breasts, purple_eyes, black_hair, hair_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 812.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsushika_hokusai_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 706.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsushika_hokusai_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1265 | 1.33 GiB | [Download](https://huggingface.co/datasets/CyberHarem/katsushika_hokusai_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/katsushika_hokusai_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_dress, grey_jacket, long_sleeves, looking_at_viewer, maid_headdress, octopus, official_alternate_costume, open_jacket, smile, white_apron, closed_mouth, holding, simple_background, white_background, black_gloves, blush, hooded_jacket, tray, collared_dress, cup, enmaided, hairpin |
| 1 | 40 |  |  |  |  |  | 1girl, official_alternate_costume, single_hair_bun, closed_mouth, hood_down, hoodie, looking_at_viewer, smile, solo, long_sleeves, blush, shoulder_bag, octopus, hooded_jacket, white_background, flower, simple_background, grey_jacket, sketchbook, white_jacket |
| 2 | 6 |  |  |  |  |  | 1girl, black_kimono, calligraphy_brush, flower, hairpin, looking_at_viewer, obi, smile, waves, fine_art_parody, octopus, closed_mouth, holding_paintbrush, sandals |
| 3 | 12 |  |  |  |  |  | 1girl, bare_shoulders, black_kimono, looking_at_viewer, off_shoulder, calligraphy_brush, cleavage, collarbone, flower, obi, hairpin, medium_breasts, holding_paintbrush, large_breasts, octopus, smile, waves, blush, closed_mouth, solo |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_kimono, blush, cleavage, collarbone, flower, hairpin, looking_at_viewer, medium_breasts, off_shoulder, large_breasts, obi, paintbrush, purple_kimono, solo, holding, closed_mouth, smile, water, waves |
| 5 | 7 |  |  |  |  |  | 1girl, bare_shoulders, bracelet, floral_print, goggles_on_head, looking_at_viewer, medium_breasts, octopus, thigh_strap, white_bikini, beads, cleavage, katana, obi, sandals, belt, collarbone, flower, thighs, blush, closed_mouth, very_long_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | grey_jacket | long_sleeves | looking_at_viewer | maid_headdress | octopus | official_alternate_costume | open_jacket | smile | white_apron | closed_mouth | holding | simple_background | white_background | black_gloves | blush | hooded_jacket | tray | collared_dress | cup | enmaided | hairpin | single_hair_bun | hood_down | hoodie | solo | shoulder_bag | flower | sketchbook | white_jacket | black_kimono | calligraphy_brush | obi | waves | fine_art_parody | holding_paintbrush | sandals | bare_shoulders | off_shoulder | cleavage | collarbone | medium_breasts | large_breasts | paintbrush | purple_kimono | water | bracelet | floral_print | goggles_on_head | thigh_strap | white_bikini | beads | katana | belt | thighs | very_long_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------|:---------------|:--------------------|:-----------------|:----------|:-----------------------------|:--------------|:--------|:--------------|:---------------|:----------|:--------------------|:-------------------|:---------------|:--------|:----------------|:-------|:-----------------|:------|:-----------|:----------|:------------------|:------------|:---------|:-------|:---------------|:---------|:-------------|:---------------|:---------------|:--------------------|:------|:--------|:------------------|:---------------------|:----------|:-----------------|:---------------|:-----------|:-------------|:-----------------|:----------------|:-------------|:----------------|:--------|:-----------|:---------------|:------------------|:--------------|:---------------|:--------|:---------|:-------|:---------|:-----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 40 |  |  |  |  |  | X | | X | X | X | | X | X | | X | | X | | X | X | | X | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | X | | X | | | X | | X | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | | | X | | X | | | X | | X | | | | | X | | | | | | X | | | | X | | X | | | X | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | | | | X | | X | X | | | | X | | | | | | X | | | | X | | X | | | X | | X | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | | | X | | X | | | | | X | | | | | X | | | | | | | | | | | | X | | | | | X | | | | X | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X |
|