| datasetId (string, lengths 2–117) | card (string, lengths 19–1.01M) |
|---|---|
ginazhouhuiwu/dlmd | ---
task_categories:
- image-to-image
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: diffuser
dtype: image
- name: lensed
dtype: image
splits:
- name: train
num_bytes: 373727531.5803672
num_examples: 24900
- name: validation
num_bytes: 1599916.8196327854
num_examples: 99
download_size: 355299926
dataset_size: 375327448.4
---
|
detectors/rademacher-ood | ---
license: unknown
size_categories: 10K<n<100K
task_categories:
- image-classification
pretty_name: Rademacher noise
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 333318820.0
num_examples: 10000
download_size: 333386324
dataset_size: 333318820.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Rademacher noise for OOD Detection
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Original Dataset Authors**: [More Information Needed]
- **OOD Split Authors:** Dan Hendrycks, Mantas Mazeika, Thomas Dietterich
- **Shared by:** Eduardo Dadalto
- **License:** unknown
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Original Dataset Paper:** [More Information Needed]
- **First OOD Application Paper:** http://arxiv.org/abs/1812.04606v3
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is intended to be used as an out-of-distribution dataset for image classification benchmarks.
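In case it is useful, here is a minimal sketch of what Rademacher noise looks like: each entry is −1 or +1 with equal probability. The image shape and uint8 scaling below are illustrative assumptions, not details taken from this card.

```python
import numpy as np

rng = np.random.default_rng(0)
# Each entry is -1 or +1 with probability 1/2 (the Rademacher distribution),
# then shifted and scaled into the usual [0, 255] uint8 image range.
noise = rng.choice([-1.0, 1.0], size=(32, 32, 3))
image = ((noise + 1.0) / 2.0 * 255.0).astype(np.uint8)
```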
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
This dataset is not annotated.
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The goal of curating and sharing this dataset on the Hugging Face Hub is to accelerate research and promote reproducibility in generalized Out-of-Distribution (OOD) detection.
Check out the Python library [detectors](https://github.com/edadaltocg/detectors) if you are interested in OOD detection.
### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
Please check original paper for details on the dataset.
### Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Please check original paper for details on the dataset.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@software{detectors2023,
author = {Eduardo Dadalto},
title = {Detectors: a Python Library for Generalized Out-Of-Distribution Detection},
url = {https://github.com/edadaltocg/detectors},
doi = {https://doi.org/10.5281/zenodo.7883596},
month = {5},
year = {2023}
}
@article{1812.04606v3,
author = {Dan Hendrycks and Mantas Mazeika and Thomas Dietterich},
title = {Deep Anomaly Detection with Outlier Exposure},
year = {2018},
month = {12},
note = {ICLR 2019; PyTorch code available at
https://github.com/hendrycks/outlier-exposure},
archiveprefix = {arXiv},
url = {http://arxiv.org/abs/1812.04606v3}
}
```
## Dataset Card Authors
Eduardo Dadalto
## Dataset Card Contact
https://huggingface.co/edadaltocg |
Coooori/dialog_data_test_hf | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 165142
num_examples: 99
download_size: 95370
dataset_size: 165142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dialog_data_test_hf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hareshgautham/detect_solar_dust | ---
task_categories:
- image-classification
language:
- en
size_categories:
- n<1K
--- |
school-knight/MFT_NEWS | ---
license: apache-2.0
---
Dataset for “One Consensus, Diverse Expressions: Ethical Spectrum Analysis of the 'Carbon' Issue in the Global News Database” (submitted to ICA 2024) |
bjoernp/tagesschau-2018-2023 | ---
dataset_info:
features:
- name: date
dtype: string
- name: headline
dtype: string
- name: short_headline
dtype: string
- name: short_text
dtype: string
- name: article
dtype: string
- name: link
dtype: string
splits:
- name: train
num_bytes: 107545823
num_examples: 21847
download_size: 63956047
dataset_size: 107545823
language:
- de
size_categories:
- 10K<n<100K
---
# Tagesschau Archive Article Dataset
A scrape of Tagesschau.de articles from 01.01.2018 to 26.04.2023. Find all source code in [github.com/bjoernpl/tagesschau](https://github.com/bjoernpl/tagesschau).
## Dataset Information
CSV structure:
| Field | Description |
| --- | --- |
| `date` | Date of the article |
| `headline` | Title of the article |
| `short_headline` | A short headline / Context |
| `short_text` | A brief summary of the article |
| `article` | The full text of the article |
| `link` | The URL of the article on tagesschau.de |
Size:
The final dataset (2018 to today) contains 225,202 articles from 1,942 days. Of these, only
21,848 are unique (Tagesschau often keeps articles in circulation for ~1 month). The total
download size is ~65 MB.
Cleaning:
- Duplicate articles are removed
- Articles with empty text are removed
- Articles with empty short_texts are removed
- Articles, headlines and short_headlines are stripped of leading and trailing whitespace
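These cleaning steps can be sketched with pandas on toy rows; this is an illustration only, not the repository's actual `clean.py`:

```python
import pandas as pd

df = pd.DataFrame({
    "headline": [" A ", "A", "B"],
    "short_headline": ["x", "x", "y "],
    "short_text": ["s", "s", ""],
    "article": ["t", "t", "u"],
})
# Strip leading/trailing whitespace from articles, headlines and short_headlines.
for col in ["article", "headline", "short_headline"]:
    df[col] = df[col].str.strip()
# Drop articles with empty text or empty short_text.
df = df[(df["article"] != "") & (df["short_text"] != "")]
# Remove duplicate articles.
df = df.drop_duplicates(subset="article")
```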
More details in [`clean.py`](https://github.com/bjoernpl/tagesschau/blob/main/clean.py). |
veonua/youtubelinks_metadata | ---
license: mit
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d78679c7 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1340
dataset_size: 182
---
# Dataset Card for "d78679c7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kjj0/cifar10-multirun-logits-60k | ---
license: mit
---
# cifar10-multirun-logits-60k
This repo contains the logit outputs produced by 61,565 independently and identically trained ResNets on the CIFAR-10 test-set.
To plot the histogram of accuracies across the first 500 trained models, run the following:
```python
import numpy as np
import matplotlib.pyplot as plt
from huggingface_hub import HfApi
api = HfApi()
logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits500.npy')
labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
logits = np.load(logits_path)
labels = np.load(labels_path)
acc = (logits.argmax(-1) == labels).mean(1)
plt.hist(acc, bins=16, range=(0.936, 0.952))
plt.xlabel('test-set accuracy')
plt.ylabel('frequency')
plt.title('500 runs of training')
plt.show()
```
<img src="acc500.png" alt="500 runs" width="600"/>
To plot the full histogram across all 61,565 runs, replace `logits500.npy` with `logits.npy` (12GB).
<img src="acc60k.png" alt="60k runs" width="600"/>
## Further detail
The file `logits.npy` is an fp16 tensor of shape `(61565, 10000, 10)`, where e.g. `logits[34211, 2341, 0]`
is the first logit (corresponding to the `airplane` class) predicted by the model at index 34,211
on example index 2,341 of the CIFAR-10 test-set.
It was generated by using 1,000 A100-hours to run
[this training script](https://github.com/KellerJordan/cifar10-loader/blob/master/example_training/train.py) 61,565 times.
Each run used identical hyperparameters but with varied training stochasticity (model initialization, data order, and data augmentations).
This tensor can be used to learn various pieces of information about the statistical nature of neural network training.
We have extracted what seemed to be a few higher-order bits in [this paper](https://arxiv.org/abs/2304.01910). There is more to discover.
The file `labels.npy` of shape `(10000,)` is the list of labels between `0` and `9` for each of the 10,000 examples.
We use the same ordering of CIFAR-10 examples as `torchvision.datasets.CIFAR10`:
```python
import numpy as np
import torchvision
from huggingface_hub import HfApi
api = HfApi()
path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
labels = np.load(path)
torchvision_labels = np.array(torchvision.datasets.CIFAR10('/tmp', train=False).targets)
assert (labels == torchvision_labels).all() # True
```
So to recover the i-th example, use the following:
```python
import torchvision

image, label = torchvision.datasets.CIFAR10('/tmp', train=False)[i]
```
---
## New information
What curve will the following code produce? ([answer](https://huggingface.co/datasets/kjj0/cifar10-multirun-logits-60k/blob/main/airplane_knn_curve.png))
```python
import numpy as np
from tqdm import tqdm
import matplotlib.pyplot as plt
import torch
from huggingface_hub import HfApi
api = HfApi()
logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits.npy')
labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
logits0 = torch.tensor(np.load(logits_path))
labels = torch.tensor(np.load(labels_path))
mm = torch.logspace(0, 5, 20).long()
mm[-1] = len(logits0)
k = 15
accs = []
for m in tqdm(mm):
# get airplane logits from ensemble of size m
logits = logits0[:m, :, 0].cuda().float()
# calculate correlations between examples
logits_norm = (logits - logits.mean(0, keepdim=True)) / logits.std(0, keepdim=True)
corr = (logits_norm.T @ logits_norm) / len(logits_norm)
# calculate knn accuracy
corr_nodiag = corr - 1000 * torch.eye(len(corr)).cuda() # remove diagonal
idxs = corr_nodiag.topk(k=k, dim=1).indices.cpu()
pred = labels[idxs].mode(dim=1).values
acc = (pred == labels).float().mean().item()
accs.append(acc)
plt.plot(mm, accs)
plt.xlabel('number of models in ensemble')
plt.ylabel('accuracy')
plt.title('knn (k=%d) accuracy on just airplane logit' % k)
plt.xscale('log')
plt.ylim(0, 1)
plt.show()
```
?????????
```python
import numpy as np
from tqdm import tqdm
import torch
import torch.nn.functional as F
from huggingface_hub import HfApi
api = HfApi()
logits_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='logits.npy')
labels_path = api.hf_hub_download('kjj0/cifar10-multirun-logits-60k', repo_type='dataset', filename='labels.npy')
logits0 = torch.tensor(np.load(logits_path))
labels = torch.tensor(np.load(labels_path))
m = 12000 # number of models
logits1 = logits0[:m].float()
print('ensemble accuracy:', (logits1.mean(0).argmax(1) == labels).float().mean())
logits_logsoftmax = logits1.log_softmax(-1)
n = logits1.shape[1]
corr_all = torch.zeros(n, n).cuda()
for c in tqdm(range(10)):
logits = logits_logsoftmax[:, :, c].cuda()
# normalize
logits -= logits.mean(0, keepdim=True)
logits -= logits.mean(1, keepdim=True)
logits /= logits.std(0, keepdim=True)
corr = (logits.T @ logits) / len(logits)
corr_all += corr
corr_nodiag = corr_all - 1e9 * torch.eye(n).cuda() # remove diagonal
nearest_nbr = corr_nodiag.argmax(1).cpu()
assert (nearest_nbr != torch.arange(n)).all() # we're not just somehow reading out the ensemble prediction!
print('kNN accuracy (k=1):', (labels[nearest_nbr] == labels).float().mean())
res = corr_nodiag.topk(k=10, dim=1)
yy = F.one_hot(labels[res.indices.cpu()]).cuda() # labels of nns
yy1 = yy * res.values[..., None]**4 # some kind of sparsity
pred = yy1.mean(1).argmax(1).cpu()
print('weighted kNN accuracy (k=10):', (pred == labels).float().mean())
```
|
lhallee/uniref_small | ---
dataset_info:
features:
- name: uniref
dtype: string
splits:
- name: train
num_bytes: 20739509
num_examples: 100000
download_size: 20824692
dataset_size: 20739509
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "uniref_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BundaSuja/RenanSantos | ---
license: openrail
---
|
hope_edi | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- en
- ml
- ta
license:
- cc-by-4.0
multilinguality:
- monolingual
- multilingual
size_categories:
- 10K<n<100K
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
paperswithcode_id: hopeedi
pretty_name: 'HopeEDI: A Multilingual Hope Speech Detection Dataset for Equality,
Diversity, and Inclusion'
tags:
- hope-speech-classification
dataset_info:
- config_name: english
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Hope_speech
'1': Non_hope_speech
'2': not-English
splits:
- name: train
num_bytes: 2306656
num_examples: 22762
- name: validation
num_bytes: 288663
num_examples: 2843
download_size: 2739901
dataset_size: 2595319
- config_name: tamil
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Hope_speech
'1': Non_hope_speech
'2': not-Tamil
splits:
- name: train
num_bytes: 1531013
num_examples: 16160
- name: validation
num_bytes: 197378
num_examples: 2018
download_size: 1795767
dataset_size: 1728391
- config_name: malayalam
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Hope_speech
'1': Non_hope_speech
'2': not-malayalam
splits:
- name: train
num_bytes: 1492031
num_examples: 8564
- name: validation
num_bytes: 180713
num_examples: 1070
download_size: 1721534
dataset_size: 1672744
config_names:
- english
- malayalam
- tamil
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Hope Speech Detection for Equality, Diversity, and Inclusion-EACL 2021](https://competitions.codalab.org/competitions/27653#learn_the_details)
- **Repository:** [HopeEDI data repository](https://competitions.codalab.org/competitions/27653#participate-get_data)
- **Paper:** [HopeEDI: A Multilingual Hope Speech Detection Dataset for Equality, Diversity, and Inclusion](https://www.aclweb.org/anthology/2020.peoples-1.5/)
- **Leaderboard:** [Rank list](https://competitions.codalab.org/competitions/27653#results)
- **Point of Contact:** [Bharathi Raja Chakravarthi](mailto:bharathiraja.akr@gmail.com)
### Dataset Summary
A Hope Speech dataset for Equality, Diversity and Inclusion (HopeEDI) containing user-generated comments from the social media platform YouTube with 28,451, 20,198 and 10,705 comments in English, Tamil and Malayalam, respectively, manually labelled as containing hope speech or not. To our knowledge, this is the first research of its kind to annotate hope speech for equality, diversity and inclusion in a multilingual setting.
### Supported Tasks and Leaderboards
The task is to identify hope speech in comments/posts on social media.
### Languages
English, Tamil and Malayalam
## Dataset Structure
### Data Instances
An example from the English dataset looks as follows:
| text | label |
| :------ | :----- |
| all lives matter .without that we never have peace so to me forever all lives matter. | Hope_speech |
| I think it's cool that you give people a voice to speak out with here on this channel. | Hope_speech |
An example from the Tamil dataset looks as follows:
| text | label |
| :------ | :----- |
| Idha solla ivalo naala | Non_hope_speech |
| இன்று தேசிய பெண் குழந்தைகள் தினம்.. பெண் குழந்தைகளை போற்றுவோம்..அவர்களை பாதுகாப்போம்... | Hope_speech |
An example from the Malayalam dataset looks as follows:
| text | label |
| :------ | :----- |
| ഇത്രെയും കഷ്ടപ്പെട്ട് വളർത്തിയ ആ അമ്മയുടെ മുഖം കണ്ടപ്പോൾ കണ്ണ് നിറഞ്ഞു പോയി | Hope_speech |
| snehikunavar aanayalum pennayalum onnichu jeevikatte..aareyum compel cheythitallalooo..parasparamulla ishtathodeyalle...avarum jeevikatte..🥰🥰 | Hope_speech |
### Data Fields
English
- `text`: English comment.
- `label`: list of the possible values: "Hope_speech", "Non_hope_speech", "not-English"
Tamil
- `text`: Tamil-English code mixed comment.
- `label`: list of the possible values: "Hope_speech", "Non_hope_speech", "not-Tamil"
Malayalam
- `text`: Malayalam-English code mixed comment.
- `label`: list of the possible values: "Hope_speech", "Non_hope_speech", "not-malayalam"
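For reference, the integer label ids stored on disk follow the `class_label` name order given in the YAML metadata above; a small sketch of the mapping for the English config (the Tamil and Malayalam configs swap in `not-Tamil` and `not-malayalam` as the third name):

```python
# Label names in index order, per the card's class_label metadata (English config).
names = ["Hope_speech", "Non_hope_speech", "not-English"]
label2id = {name: i for i, name in enumerate(names)}
id2label = {i: name for i, name in enumerate(names)}
```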
### Data Splits
| | train | validation |
| ----- |------:|-----------:|
| English | 22762 | 2843 |
| Tamil | 16160 | 2018 |
| Malayalam | 8564 | 1070 |
## Dataset Creation
### Curation Rationale
Hope is considered significant for the well-being, recuperation and restoration of human life by health professionals.
Hate speech or offensive language detection dataset is not available for code-mixed Tamil and code-mixed Malayalam, and it does not take into account LGBTIQ, women in STEM and other minorities. Thus, we cannot use existing hate speech or offensive language detection datasets to detect hope or non-hope for EDI of minorities.
### Source Data
#### Initial Data Collection and Normalization
For English, we collected data on recent topics of EDI, including women in STEM, LGBTIQ issues, COVID-19, Black Lives Matters, United Kingdom (UK) versus China, United States of America (USA) versus China and Australia versus China from YouTube video comments. The data was collected from videos of people from English-speaking countries, such as Australia, Canada, the Republic of Ireland, United Kingdom, the United States of America and New Zealand.
For Tamil and Malayalam, we collected data from India on the recent topics regarding LGBTIQ issues, COVID-19, women in STEM, the Indo-China war and Dravidian affairs.
#### Who are the source language producers?
Youtube users
### Annotations
#### Annotation process
We created Google forms to collect annotations from annotators. Each form contained a maximum of 100 comments, and each page contained a maximum of 10 comments to maintain the quality of annotation. We collected information on the gender, educational background and the medium of schooling of the annotator to know the diversity of the annotator and avoid bias. We educated annotators by providing them with YouTube videos on EDI. A minimum of three annotators annotated each form.
#### Who are the annotators?
For English language comments, annotators were from Australia, the Republic of Ireland, the United Kingdom and the United States of America. For Tamil, we were able to get annotations from both people from the state of Tamil Nadu of India and from Sri Lanka. Most of the annotators were graduate or post-graduate students.
### Personal and Sensitive Information
Social media data is highly sensitive, and even more so when it is related to a minority population, such as the LGBTIQ community or women. We have taken full consideration to minimise the risk associated with individual identity in the data by removing personal information from the dataset, such as names (but not celebrity names). However, to study EDI, we needed to keep information relating to the following characteristics: racial, gender, sexual orientation, ethnic origin and philosophical beliefs. Annotators were only shown anonymised posts and agreed to make no attempts to contact the comment creators. The dataset will only be made available for research purposes to researchers who agree to follow ethical guidelines.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
This work is licensed under a [Creative Commons Attribution 4.0 International Licence](http://creativecommons.org/licenses/by/4.0/).
### Citation Information
```
@inproceedings{chakravarthi-2020-hopeedi,
title = "{H}ope{EDI}: A Multilingual Hope Speech Detection Dataset for Equality, Diversity, and Inclusion",
author = "Chakravarthi, Bharathi Raja",
booktitle = "Proceedings of the Third Workshop on Computational Modeling of People's Opinions, Personality, and Emotion's in Social Media",
month = dec,
year = "2020",
address = "Barcelona, Spain (Online)",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.peoples-1.5",
pages = "41--53",
abstract = "Over the past few years, systems have been developed to control online content and eliminate abusive, offensive or hate speech content. However, people in power sometimes misuse this form of censorship to obstruct the democratic right of freedom of speech. Therefore, it is imperative that research should take a positive reinforcement approach towards online content that is encouraging, positive and supportive contents. Until now, most studies have focused on solving this problem of negativity in the English language, though the problem is much more than just harmful content. Furthermore, it is multilingual as well. Thus, we have constructed a Hope Speech dataset for Equality, Diversity and Inclusion (HopeEDI) containing user-generated comments from the social media platform YouTube with 28,451, 20,198 and 10,705 comments in English, Tamil and Malayalam, respectively, manually labelled as containing hope speech or not. To our knowledge, this is the first research of its kind to annotate hope speech for equality, diversity and inclusion in a multilingual setting. We determined that the inter-annotator agreement of our dataset using Krippendorff{'}s alpha. Further, we created several baselines to benchmark the resulting dataset and the results have been expressed using precision, recall and F1-score. The dataset is publicly available for the research community. We hope that this resource will spur further research on encouraging inclusive and responsive speech that reinforces positiveness.",
}
```
### Contributions
Thanks to [@jamespaultg](https://github.com/jamespaultg) for adding this dataset. |
neelblabla/enron_labeled_emails_with_subjects-llama2-7b_finetuning | ---
task_categories:
- text-classification
language:
- en
pretty_name: enron(unprocessed)_labeled_prompts
size_categories:
- 1K<n<10K
--- |
result-muse256-muse512-wuerst-sdv15/3677a860 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 240
num_examples: 10
download_size: 1441
dataset_size: 240
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "3677a860"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kgr123/quality_counter_2500_4_buckets | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 13949913
num_examples: 1929
- name: train
num_bytes: 13811555
num_examples: 1935
- name: validation
num_bytes: 14100212
num_examples: 1941
download_size: 9370743
dataset_size: 41861680
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Bingsu/openwebtext_20p | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- cc0-1.0
multilinguality:
- monolingual
paperswithcode_id: openwebtext
pretty_name: openwebtext_20p
size_categories:
- 1M<n<10M
source_datasets:
- extended|openwebtext
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# openwebtext_20p
## Dataset Description
- **Origin:** [openwebtext](https://huggingface.co/datasets/openwebtext)
- **Download Size** 4.60 GiB
- **Generated Size** 7.48 GiB
- **Total Size** 12.08 GiB
first 20% of [openwebtext](https://huggingface.co/datasets/openwebtext) |
erhwenkuo/wikipedia-zhtw | ---
dataset_info:
config_name: '20231001'
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1682641991
num_examples: 1373081
download_size: 1064907519
dataset_size: 1682641991
configs:
- config_name: '20231001'
data_files:
- split: train
path: 20231001/train-*
license: cc-by-sa-3.0
task_categories:
- text-generation
- fill-mask
language:
- zh
size_categories:
- 1M<n<10M
---
# Dataset Card for "wikipedia-zhtw"
The Wikipedia dataset contains articles in many languages. This dataset is built from the Chinese `zhwiki` download files in the Wikipedia dumps (https://dumps.wikimedia.org/). Each example contains the full content of one Wikipedia article, cleaned to remove unwanted parts (such as references).
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **zhwiki downloads:** [https://dumps.wikimedia.org/zhwiki](https://dumps.wikimedia.org/zhwiki)
## Data Dump Versions
Wikipedia dumps its site data periodically; as of `2023/10/10`, the following dumps were available for download:
|Dump directory|Dump date|
|-------------|--------|
|`20230620/`|01-Aug-2023 09:31|
|`20230701/`|20-Aug-2023 09:41|
|`20230720/`|01-Sep-2023 09:31|
|`20230801/`|20-Sep-2023 09:38|
|`20230820/`|01-Oct-2023 09:34|
|`20230901/`|04-Sep-2023 21:18|
|`20230920/`|22-Sep-2023 01:59|
|`20231001/`|10-Oct-2023 02:55|
|`latest/`|10-Oct-2023 02:55|
This dataset is periodically rebuilt from the most recent clearly dated dump, for ease of verification and use.
## Download and Cleaning
1. Download the zhwiki data dump file
2. Extract the article content with the [WikiExtractor](https://github.com/attardi/wikiextractor) tool
3. Clean the data and convert it to JSONL files
4. Load the JSONL files with the Hugging Face [Datasets](https://pypi.org/project/datasets/) library and upload them to the Hugging Face Hub
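Step 3 of this pipeline can be sketched as follows, using a hypothetical record; the card does not show the real cleaning code:

```python
import json

# Write cleaned articles as one JSON object per line (JSONL).
records = [
    {"id": "333", "url": "https://zh.wikipedia.org/wiki?curid=333",
     "title": "鄧麗君", "text": "..."},
]
lines = [json.dumps(r, ensure_ascii=False) for r in records]
jsonl = "\n".join(lines)
# Each line round-trips back to the original record.
assert json.loads(jsonl.splitlines()[0])["id"] == "333"
```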
## Dataset Structure
An example looks like this:
```python
{'id': '333',
 'url': 'https://zh.wikipedia.org/wiki?curid=333',
 'title': '鄧麗君',
 'text': '鄧麗君,臺灣歌手、演員及慈善家,本名鄧麗筠。她是20世紀後期華語流行音樂具代表性的人物...'}
```
## Data Fields
The data fields are the same across all configurations:
- `id (str)`: ID of the article.
- `url (str)`: URL of the article.
- `title (str)`: Title of the article.
- `text (str)`: Text content of the article.
## Usage
```python
from datasets import load_dataset

# Pass the desired dump date as the second argument
load_dataset("erhwenkuo/wikipedia-zhtw", "20231001")
```
## Licensing Information
Most Wikipedia article text and many of its images are co-licensed under the `Creative Commons Attribution-ShareAlike 3.0 Unported License (CC BY-SA)` and the `GNU Free Documentation License (GFDL)`.
## Citation
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
``` |
Rane7/TLRM_Dataset | ---
license: mit
---
|
casino | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- conversational
- text-generation
- fill-mask
task_ids:
- dialogue-modeling
paperswithcode_id: casino
pretty_name: Campsite Negotiation Dialogues
dataset_info:
features:
- name: chat_logs
list:
- name: text
dtype: string
- name: task_data
struct:
- name: data
dtype: string
- name: issue2youget
struct:
- name: Firewood
dtype: string
- name: Water
dtype: string
- name: Food
dtype: string
- name: issue2theyget
struct:
- name: Firewood
dtype: string
- name: Water
dtype: string
- name: Food
dtype: string
- name: id
dtype: string
- name: participant_info
struct:
- name: mturk_agent_1
struct:
- name: value2issue
struct:
- name: Low
dtype: string
- name: Medium
dtype: string
- name: High
dtype: string
- name: value2reason
struct:
- name: Low
dtype: string
- name: Medium
dtype: string
- name: High
dtype: string
- name: outcomes
struct:
- name: points_scored
dtype: int32
- name: satisfaction
dtype: string
- name: opponent_likeness
dtype: string
- name: demographics
struct:
- name: age
dtype: int32
- name: gender
dtype: string
- name: ethnicity
dtype: string
- name: education
dtype: string
- name: personality
struct:
- name: svo
dtype: string
- name: big-five
struct:
- name: extraversion
dtype: float32
- name: agreeableness
dtype: float32
- name: conscientiousness
dtype: float32
- name: emotional-stability
dtype: float32
- name: openness-to-experiences
dtype: float32
- name: mturk_agent_2
struct:
- name: value2issue
struct:
- name: Low
dtype: string
- name: Medium
dtype: string
- name: High
dtype: string
- name: value2reason
struct:
- name: Low
dtype: string
- name: Medium
dtype: string
- name: High
dtype: string
- name: outcomes
struct:
- name: points_scored
dtype: int32
- name: satisfaction
dtype: string
- name: opponent_likeness
dtype: string
- name: demographics
struct:
- name: age
dtype: int32
- name: gender
dtype: string
- name: ethnicity
dtype: string
- name: education
dtype: string
- name: personality
struct:
- name: svo
dtype: string
- name: big-five
struct:
- name: extraversion
dtype: float32
- name: agreeableness
dtype: float32
- name: conscientiousness
dtype: float32
- name: emotional-stability
dtype: float32
- name: openness-to-experiences
dtype: float32
- name: annotations
list:
list: string
splits:
- name: train
num_bytes: 3211407
num_examples: 1030
download_size: 1247368
dataset_size: 3211407
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Casino
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Github: Kushal Chawla CaSiNo](https://github.com/kushalchawla/CaSiNo)
- **Paper:** [CaSiNo: A Corpus of Campsite Negotiation Dialogues for Automatic Negotiation Systems](https://aclanthology.org/2021.naacl-main.254.pdf)
- **Point of Contact:** [Kushal Chawla](kchawla@usc.edu)
### Dataset Summary
We provide a novel dataset (referred to as CaSiNo) of 1030 negotiation dialogues. Two participants take the role of campsite neighbors and negotiate for Food, Water, and Firewood packages, based on their individual preferences and requirements. This design keeps the task tractable, while still facilitating linguistically rich and personal conversations. This helps to overcome the limitations of prior negotiation datasets such as Deal or No Deal and Craigslist Bargain. Each dialogue consists of rich meta-data including participant demographics, personality, and their subjective evaluation of the negotiation in terms of satisfaction and opponent likeness.
### Supported Tasks and Leaderboards
Train end-to-end models for negotiation
### Languages
English
## Dataset Structure
### Data Instances
```
{
"chat_logs": [
{
"text": "Hello! \ud83d\ude42 Let's work together on a deal for these packages, shall we? What are you most interested in?",
"task_data": {},
"id": "mturk_agent_1"
},
...
],
"participant_info": {
"mturk_agent_1":
{
"value2issue": ...
"value2reason": ...
"outcomes": ...
"demographics": ...
"personality": ...
},
"mturk_agent_2": ...
},
"annotations": [
["Hello! \ud83d\ude42 Let's work together on a deal for these packages, shall we? What are you most interested in?", "promote-coordination,elicit-pref"],
...
]
}
```
### Data Fields
- `chat_logs`: The negotiation dialogue between two participants
- `text`: The dialogue utterance
- `task_data`: Meta-data associated with the utterance such as the deal submitted by a participant
- `id`: The ID of the participant who typed this utterance
- `participant_info`: Meta-data about the two participants in this conversation
  - `mturk_agent_1`: For the first participant (note that 'first' is only for reference; there is no ordering between the participants, and either participant can start the conversation)
- `value2issue`: The priority order of this participant among Food, Water, Firewood
- `value2reason`: The personal arguments given by the participants themselves, consistent with the above preference order. This preference order and these arguments were submitted before the negotiation began.
- `outcomes`: The negotiation outcomes for this participant including objective and subjective assessment.
- `demographics`: Demographic attributes of the participant in terms of age, gender, ethnicity, and education.
- `personality`: Personality attributes for this participant, in terms of Big-5 and Social Value Orientation
- `mturk_agent_2`: For the second participant; follows the same structure as above
- `annotations`: Strategy annotations for each utterance in the dialogue, wherever available. The first element represents the utterance and the second represents a comma-separated list of all strategies present in that utterance.
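The comma-separated strategy lists can be split into per-dialogue strategy counts; a minimal sketch (the second utterance and its labels are illustrative, not taken from the corpus):

```python
from collections import Counter

# Example annotations in the format described above: [utterance, "strat1,strat2"].
annotations = [
    ["Hello! Let's work together on a deal for these packages, shall we? "
     "What are you most interested in?", "promote-coordination,elicit-pref"],
    ["I need firewood the most. How about you?", "self-need,elicit-pref"],
]

# Count how often each strategy label occurs across the dialogue.
counts = Counter(
    strategy
    for _, strategies in annotations
    for strategy in strategies.split(",")
)
print(counts["elicit-pref"])  # → 2
```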
### Data Splits
No default data split has been provided. Hence, all 1030 data points are under the 'train' split.
| | Train |
| ----- | ----- |
| total dialogues | 1030 |
| annotated dialogues | 396 |
## Dataset Creation
### Curation Rationale
The dataset was collected to address the limitations in prior negotiation datasets from the perspective of downstream applications in pedagogy and conversational AI. Please refer to the original paper published at NAACL 2021 for details about the rationale and data curation steps ([source paper](https://aclanthology.org/2021.naacl-main.254.pdf)).
### Source Data
#### Initial Data Collection and Normalization
The dialogues were crowdsourced on Amazon Mechanical Turk. The strategy annotations were performed by expert annotators (first three authors of the paper). Please refer to the original dataset paper published at NAACL 2021 for more details ([source paper](https://aclanthology.org/2021.naacl-main.254.pdf)).
#### Who are the source language producers?
The primary producers are Turkers on Amazon Mechanical Turk platform. Two turkers were randomly paired with each other to engage in a negotiation via a chat interface. Please refer to the original dataset paper published at NAACL 2021 for more details ([source paper](https://aclanthology.org/2021.naacl-main.254.pdf)).
### Annotations
#### Annotation process
From the [source paper](https://aclanthology.org/2021.naacl-main.254.pdf) for this dataset:
>Three expert annotators independently annotated 396 dialogues containing 4615 utterances. The annotation guidelines were iterated over a subset of 5 dialogues, while the reliability scores were computed on a different subset of 10 dialogues. We use the nominal form of Krippendorff’s alpha (Krippendorff, 2018) to measure the inter-annotator agreement. We provide the annotation statistics in Table 2. Although we release all the annotations, we skip Coordination and Empathy for our analysis in this work, due to higher subjectivity resulting in relatively lower reliability scores.
#### Who are the annotators?
Three expert annotators (first three authors of the paper).
### Personal and Sensitive Information
All personally identifiable information about the participants such as MTurk Ids or HIT Ids was removed before releasing the data.
## Considerations for Using the Data
### Social Impact of Dataset
Please refer to Section 8.2 in the [source paper](https://aclanthology.org/2021.naacl-main.254.pdf).
### Discussion of Biases
Please refer to Section 8.2 in the [source paper](https://aclanthology.org/2021.naacl-main.254.pdf).
### Other Known Limitations
Please refer to Section 7 in the [source paper](https://aclanthology.org/2021.naacl-main.254.pdf).
## Additional Information
### Dataset Curators
Corresponding Author: Kushal Chawla (`kchawla@usc.edu`)\
Affiliation: University of Southern California\
Please refer to the [source paper](https://aclanthology.org/2021.naacl-main.254.pdf) for the complete author list.
### Licensing Information
The project is licensed under CC-by-4.0
### Citation Information
```
@inproceedings{chawla2021casino,
title={CaSiNo: A Corpus of Campsite Negotiation Dialogues for Automatic Negotiation Systems},
author={Chawla, Kushal and Ramirez, Jaysa and Clever, Rene and Lucas, Gale and May, Jonathan and Gratch, Jonathan},
booktitle={Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies},
pages={3167--3185},
year={2021}
}
```
### Contributions
Thanks to [Kushal Chawla](https://kushalchawla.github.io/) for adding this dataset. |
kaleemWaheed/twitter_dataset_1713146097 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25314
num_examples: 56
download_size: 12998
dataset_size: 25314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dfalbel/cran-packages | ---
license: other
task_categories:
- text-generation
language:
- code
pretty_name: cran-packages
size_categories:
- 100K<n<1M
---
## CRAN packages dataset
R and Rmd source codes for CRAN packages.
The dataset has been constructed using the following steps:
- Downloaded the latest version of all packages on CRAN (see the last-updated date below). The source code was downloaded from the [GitHub mirror](https://github.com/cran).
- Identified the license of each package from its DESCRIPTION file and classified each into a `license_code`. See the licenses.csv file.
- Extracted the R and Rmd source files from all packages and joined them with the package licenses.
Datasets are provided as parquet files containing the following columns:
```
FileSystemDataset with 1 Parquet file
package: string
path: string
content: large_string
size: double
license: string
```
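Once a shard is downloaded, the `license` column makes it easy to keep only sources under a given license; a minimal pandas sketch (the toy rows below stand in for real package data):

```python
import pandas as pd

# Toy rows standing in for one parquet shard with the schema above.
df = pd.DataFrame({
    "package": ["pkgA", "pkgB", "pkgC"],
    "path": ["R/a.R", "R/b.R", "vignettes/c.Rmd"],
    "content": ["f <- function(x) x + 1", "g <- 2", "# a vignette chunk"],
    "size": [23.0, 6.0, 18.0],
    "license": ["MIT", "GPL-3", "MIT"],
})

# Keep only permissively licensed sources, e.g. for training-data curation.
mit_only = df[df["license"] == "MIT"]
print(sorted(mit_only["package"]))  # → ['pkgA', 'pkgC']
```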
Last updated: Jun 6th 2023
## Changelog
- v1: Initial version
- dev: added all CRAN files and a license field that allows filtering by license; also removed some unused columns.
|
hhhwmws/zhaomin | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
Zhao Min character data supporting ChatHaruhi2, which can be invoked as follows:
```python
from chatharuhi import ChatHaruhi
chatbot = ChatHaruhi( role_from_hf = 'hhhwmws/zhaomin', \
llm = 'openai')
response = chatbot.chat(role='张无忌', text = '赵敏!')
print(response)
```
Uploader: Weishi Mi (米唯实)
For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)
You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)
### Citation
Please cite the repo if you use the data or code in this repo.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
muhammadravi251001/idkmrc-nli | ---
annotations_creators:
- machine-generated
- manual-partial-validation
language_creators:
- expert-generated
language:
- id
license: unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- IDK-MRC
task_categories:
- text-classification
task_ids:
- natural-language-inference
pretty_name: IDK-MRC-NLI
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
config_name: idkmrc-nli
splits:
- name: train
num_bytes: 5916125
num_examples: 18665
- name: validation
num_bytes: 473125
num_examples: 1529
- name: test
num_bytes: 521375
num_examples: 1689
download_size: 6910625
dataset_size: 21883
---
# Dataset Card for IDK-MRC-NLI
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [Hugging Face](https://huggingface.co/datasets/muhammadravi251001/idkmrc-nli)
- **Point of Contact:** [Hugging Face](https://huggingface.co/datasets/muhammadravi251001/idkmrc-nli)
- **Experiment:** [Github](https://github.com/muhammadravi251001/multilingual-qas-with-nli)
### Dataset Summary
The IDKMRC-NLI dataset is derived from the IDK-MRC question answering dataset, utilizing named entity recognition (NER), chunking tags, Regex, and embedding similarity techniques to determine its contradiction sets.
Collected through this process, the dataset comprises various columns beyond premise, hypothesis, and label, including properties aligned with NER and chunking tags.
This dataset is designed to facilitate Natural Language Inference (NLI) tasks and contains information extracted from diverse sources to provide comprehensive coverage.
Each data instance encapsulates premise, hypothesis, label, and additional properties pertinent to NLI evaluation.
### Supported Tasks and Leaderboards
- Natural Language Inference for Indonesian
### Languages
Indonesian
## Dataset Structure
### Data Instances
An example of `test` looks as follows.
```
{
"premise": "Karangkancana adalah sebuah kecamatan di Kabupaten Kuningan, Provinsi Jawa Barat, Indonesia.",
"hypothesis": "Dimanakah letak Desa Karang kancana? Kabupaten Kuningan, Provinsi Jawa Barat, Indonesia.",
"label": 0
}
```
### Data Fields
The data fields are:
- `premise`: a `string` feature
- `hypothesis`: a `string` feature
- `label`: a classification label, with possible values including `entailment` (0), `neutral` (1), `contradiction` (2).
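A minimal sketch of the integer-to-name mapping implied by the metadata above (the helper name is our own):

```python
# Mirrors the class_label names declared in the dataset metadata.
LABEL_NAMES = ["entailment", "neutral", "contradiction"]

def int2str(label: int) -> str:
    """Map an integer label to its class name."""
    return LABEL_NAMES[label]

# The example instance above has label 0:
print(int2str(0))  # → entailment
```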
### Data Splits
The data is split across `train`, `valid`, and `test`.
| split | # examples |
|----------|-------:|
|train| 18665|
|valid| 1529|
|test| 1689|
## Dataset Creation
### Curation Rationale
Indonesian NLP is considered under-resourced. We need an NLI dataset to fine-tune NLI models so that they can be utilized alongside QA models to improve QA performance.
### Source Data
#### Initial Data Collection and Normalization
We collected the data from a prominent QA dataset in Indonesian. The annotation was done entirely by the original dataset's researchers.
#### Who are the source language producers?
This synthetic data was produced by a machine, but the original data was produced by humans.
### Personal and Sensitive Information
There might be some personal information coming from Wikipedia and news, especially the information of famous/important people.
## Considerations for Using the Data
### Discussion of Biases
The QA dataset (and therefore the NLI dataset derived from it) was created using premise sentences taken from Wikipedia and news articles. These sources may contain some bias.
### Other Known Limitations
No other known limitations
## Additional Information
### Dataset Curators
This dataset is the result of the collaborative work of Indonesian researchers from the University of Indonesia, Mohamed bin Zayed University of Artificial Intelligence, and the Korea Advanced Institute of Science & Technology.
### Licensing Information
The license is Unknown. Please contact authors for any information on the dataset. |
H13u/mtet-test | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: test
- name: validation
task_categories:
- translation
language:
- en
- vi
tags:
- LM
size_categories:
- 1M<n<10M
---
# Mtet
- Num examples:
- 5,072 (test)
- 6,212 (validation)
- Language: English, Vietnamese
## Prompts
- `"Translate the following sentence into <target>: "`
- `"What is the <target> translation for: "`
- `"What is the <target> equivalent of: "`
- `"What does the following sentence means in <target>: "`
- `"Interpret the following sentence into <target>: "`
- `"What is the <target> interpretation for: "`
- `"The <target> translation of the following sentence: "`
- `"What is the <target> meaning of the following sentence: "`
- `"What is the <target> meaning of this sentence: "`
- `"Please translate the following sentence to <target>: "`
- `"Dịch câu sau sang tiếng <target>: "`
- `"Nghĩa tiếng <target> của câu sau: "`
- `"Dịch câu tiếng <from> sau sang tiếng <target>: "`
- `"Thông dịch câu tiếng <from> sau tiếng <target>: "`
- `"Chuyển câu tiếng <from> sang tiếng <target>: "`
- `"Chuyển nghĩa câu tiếng <from> sang tiếng <target>: "`
- `"Câu tiếng <from> có nghĩa là gì trong tiếng <target>: "`
- `"Câu sau có nghĩa tiếng <target> là gì: "`
- `"Hãy dịch câu sau sang tiếng <target>: "`
- `"Giải thích nghĩa câu sau sang tiếng <target>: "`
- `"Giải thích nghĩa câu tiếng <from> sang tiếng <target>"`
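At training time the `<target>` and `<from>` slots are presumably substituted with language names; a minimal, assumed sketch of that substitution:

```python
def fill(template: str, target: str, source: str = "") -> str:
    """Substitute the <target>/<from> placeholders with language names."""
    out = template.replace("<target>", target)
    if source:
        out = out.replace("<from>", source)
    return out

print(fill("What is the <target> translation for: ", "Vietnamese"))
# → What is the Vietnamese translation for:
```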
|
mcemilg/turkish-plu-step-inference | ---
task_categories:
- text-classification
language:
- tr
size_categories:
- 100K<n<1M
---
Homepage: https://github.com/GGLAB-KU/turkish-plu |
Dampish/Behemoth_QuickTrain_V3 | ---
license: cc-by-nc-4.0
---
|
Patsagorn/pcshsbr-music-request | ---
license: cc-by-4.0
language:
- th
- en
tags:
- music
pretty_name: PCSHSBR Music Request History
size_categories:
- 1K<n<10K
---
# Dataset Card for PCSHSBR Music Request History
Request history from [PCSHSBR Music Queue](https://github.com/pcshsbr/music-queue) include timestamp, song's name, and artist.
## Dataset Details
### Dataset Description
- **Curated by:** Patsagorn Yuenyong
- **Shared by:** PCSHSBR Student Council
- **Language(s) (NLP):** Thai/English
- **License:** CC-BY-4.0
### Dataset Sources
Human input from [PCSHSBR Music Queue](https://github.com/pcshsbr/music-queue).
## Dataset Structure
It is a CSV file that originally had 4 columns: timestamp, song's name, artist, and note (used for disambiguation, as we use YouTube to play the music). In this dataset
the note column is removed because we also used that field to announce who each song was for; as a result the column contained student names, so
we removed it.
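A minimal pandas sketch of working with the released columns (the header names `timestamp`, `song`, `artist` and the rows are assumptions for illustration):

```python
import io
import pandas as pd

# Toy rows matching the three released columns (header names are assumptions).
raw = io.StringIO(
    "timestamp,song,artist\n"
    "2023-06-01 08:00,Shape of You,Ed Sheeran\n"
    "2023-06-01 08:05,Lover,Taylor Swift\n"
    "2023-06-02 08:01,Perfect,Ed Sheeran\n"
)
df = pd.read_csv(raw, parse_dates=["timestamp"])

# Most-requested artist in this toy sample.
print(df["artist"].value_counts().idxmax())  # → Ed Sheeran
```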
## Dataset Creation
### Curation Rationale
Our tool for requesting and displaying the music queue collects history on a daily basis. That history otherwise has nowhere to go, so we publish it here
in the hope that some bright mind will do something with it.
#### Data Collection and Processing
Students and staff request songs using a Google Form; [the tool](https://github.com/pcshsbr/music-queue) collects the requests in a Google Sheet and queries
that table for display. The full request history remains in the Google Sheet.
#### Personal and Sensitive Information
Due to the large number of rows, this dataset may or may not contain student names (in the form of nicknames). However, these are not identifiable
data, so the dataset is safe to make public, unlike the note column, which is not included in this dataset.
## Dataset Card Contact
- Patsagorn Y. (patsagorn_yue@pccbr.ac.th) |
balaramas/en_hi_st | ---
license: other
---
|
open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v3 | ---
pretty_name: Evaluation run of mlabonne/Beyonder-4x7B-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/Beyonder-4x7B-v3](https://huggingface.co/mlabonne/Beyonder-4x7B-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T12:17:08.484037](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v3/blob/main/results_2024-03-22T12-17-08.484037.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545651435392091,\n\
\ \"acc_stderr\": 0.0321355052869314,\n \"acc_norm\": 0.6539201029549403,\n\
\ \"acc_norm_stderr\": 0.032807725306125385,\n \"mc1\": 0.5887392900856793,\n\
\ \"mc1_stderr\": 0.017225627083660874,\n \"mc2\": 0.7444087204884349,\n\
\ \"mc2_stderr\": 0.014280260468011773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.697098976109215,\n \"acc_stderr\": 0.013428241573185349,\n\
\ \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134575\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7144991037641903,\n\
\ \"acc_stderr\": 0.004507296196227808,\n \"acc_norm\": 0.888568014339773,\n\
\ \"acc_norm_stderr\": 0.0031402323925687967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250437,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4111731843575419,\n\
\ \"acc_stderr\": 0.016456498033977512,\n \"acc_norm\": 0.4111731843575419,\n\
\ \"acc_norm_stderr\": 0.016456498033977512\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182653,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015057,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5887392900856793,\n\
\ \"mc1_stderr\": 0.017225627083660874,\n \"mc2\": 0.7444087204884349,\n\
\ \"mc2_stderr\": 0.014280260468011773\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237424\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \
\ \"acc_stderr\": 0.012551285331470152\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/Beyonder-4x7B-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|arc:challenge|25_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|gsm8k|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hellaswag|10_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T12-17-08.484037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T12-17-08.484037.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- '**/details_harness|winogrande|5_2024-03-22T12-17-08.484037.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T12-17-08.484037.parquet'
- config_name: results
data_files:
- split: 2024_03_22T12_17_08.484037
path:
- results_2024-03-22T12-17-08.484037.parquet
- split: latest
path:
- results_2024-03-22T12-17-08.484037.parquet
---
# Dataset Card for Evaluation run of mlabonne/Beyonder-4x7B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Beyonder-4x7B-v3](https://huggingface.co/mlabonne/Beyonder-4x7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v3",
"harness_winogrande_5",
split="train")
```
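Judging from the split names listed in this card's configs, a run's split name appears to be derived from its timestamp by replacing the `-` and `:` separators with `_`. A minimal sketch of that mapping (an assumption inferred from the config listing, not a documented API):

```python
# Derive the per-run split name from the run timestamp, assuming the
# naming convention visible in this card's configs: "-" and ":" in the
# ISO timestamp are replaced with "_".
run_timestamp = "2024-03-22T12:17:08.484037"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_03_22T12_17_08.484037
```

Passing this split name to `load_dataset` (instead of `"latest"`) should pin the results of that specific run.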
## Latest results
These are the [latest results from run 2024-03-22T12:17:08.484037](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Beyonder-4x7B-v3/blob/main/results_2024-03-22T12-17-08.484037.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its own configuration, under the "latest" split):
```json
{
"all": {
"acc": 0.6545651435392091,
"acc_stderr": 0.0321355052869314,
"acc_norm": 0.6539201029549403,
"acc_norm_stderr": 0.032807725306125385,
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660874,
"mc2": 0.7444087204884349,
"mc2_stderr": 0.014280260468011773
},
"harness|arc:challenge|25": {
"acc": 0.697098976109215,
"acc_stderr": 0.013428241573185349,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134575
},
"harness|hellaswag|10": {
"acc": 0.7144991037641903,
"acc_stderr": 0.004507296196227808,
"acc_norm": 0.888568014339773,
"acc_norm_stderr": 0.0031402323925687967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250437,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4111731843575419,
"acc_stderr": 0.016456498033977512,
"acc_norm": 0.4111731843575419,
"acc_norm_stderr": 0.016456498033977512
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182653,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660874,
"mc2": 0.7444087204884349,
"mc2_stderr": 0.014280260468011773
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237424
},
"harness|gsm8k|5": {
"acc": 0.7058377558756633,
"acc_stderr": 0.012551285331470152
}
}
```
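For readers who want a single MMLU-style number, the per-task entries above can be aggregated by hand. Below is a minimal sketch (not part of the official evaluation tooling) that averages the `acc` field over the `hendrycksTest` subtasks, using a three-task excerpt of the JSON above for brevity:

```python
# Excerpt of the results JSON above; the full run covers all 57
# hendrycksTest (MMLU) subtasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracy.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"{len(mmlu_tasks)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```

The same loop over the full dictionary reproduces the leaderboard's aggregated MMLU score.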
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hkjoe0210/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 12862548
num_examples: 1000
download_size: 3509115
dataset_size: 12862548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
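The `is_pull_request` flag in the schema above exists because the GitHub REST API returns pull requests from its issues endpoint as well. A minimal sketch of separating the two (plain dictionaries stand in for dataset rows here; with the `datasets` library you would call `Dataset.filter` on the loaded repo instead):

```python
# Toy rows mimicking the schema above; a real run would use
# load_dataset("hkjoe0210/github-issues") and dataset.filter(...).
rows = [
    {"number": 1, "title": "Crash on load", "is_pull_request": False},
    {"number": 2, "title": "Fix crash on load", "is_pull_request": True},
    {"number": 3, "title": "Typo in README", "is_pull_request": False},
]

issues = [r for r in rows if not r["is_pull_request"]]
pulls = [r for r in rows if r["is_pull_request"]]
print(f"{len(issues)} issues, {len(pulls)} pull requests")
```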
|
mnoukhov/results | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: generations
dtype: string
splits:
- name: train
num_bytes: 162173
num_examples: 100
download_size: 104621
dataset_size: 162173
---
# Dataset Card for "results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlirezaGhasri/Stable | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 435147.0
num_examples: 7
download_size: 418910
dataset_size: 435147.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SoAp9035/turkish_instructions | ---
license: apache-2.0
language:
- tr
---
# Turkish Instructions
## Apache 2.0
This dataset is a cleaned and organized version (for Mistral) of [afkfatih/turkishdataset](https://huggingface.co/datasets/afkfatih/turkishdataset).
|
arthurmluz/xlsum_data-xlsum_cstnews_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 27891191
num_examples: 7175
download_size: 17065662
dataset_size: 27891191
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "xlsum_data-xlsum_cstnews_results"
rouge={'rouge1': 0.28092424335188093, 'rouge2': 0.09690108337371167, 'rougeL': 0.18905486438287647, 'rougeLsum': 0.18905486438287647}
Bert={'precision': 0.696474292253368, 'recall': 0.7445932821861958, 'f1': 0.7192442754004476}
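As a quick sanity check, the reported mean F1 is close to the harmonic mean of the reported mean precision and recall (the identity is exact per example and only approximate for corpus-level means):

```python
# Reported corpus-level BERTScore means from above.
precision = 0.696474292253368
recall = 0.7445932821861958

# Per-example, BERTScore F1 = 2PR / (P + R); averaging breaks the
# identity slightly, so this only approximates the reported mean F1.
f1_harmonic = 2 * precision * recall / (precision + recall)
print(f"harmonic-mean F1 = {f1_harmonic:.4f}")
```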
mover = 0.596707339115228 |
joey234/mmlu-high_school_government_and_politics | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4913
num_examples: 5
- name: test
num_bytes: 867595
num_examples: 193
download_size: 106889
dataset_size: 872508
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_government_and_politics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akshay14/CAD_images_BLIP_small | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2259005.0
num_examples: 38
download_size: 2126525
dataset_size: 2259005.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/miyauchi_kazuho_nonnonbiyori | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Miyauchi Kazuho
This is the dataset of Miyauchi Kazuho, containing 172 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 172 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 411 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 427 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 172 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 172 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 172 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 411 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 411 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 332 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 427 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 427 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Sakil/DocuBotMultiPDFConversationalAssistant | ---
license: apache-2.0
---
|
CyberHarem/emma_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of emma/エマ/艾玛/엠마 (Nikke: Goddess of Victory)
This is the dataset of emma/エマ/艾玛/엠마 (Nikke: Goddess of Victory), containing 63 images and their tags.
The core tags of this character are `long_hair, breasts, hat, bangs, large_breasts, yellow_eyes, beret, blonde_hair, brown_hair, huge_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 63 | 101.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emma_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 63 | 51.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emma_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 153 | 107.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emma_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 63 | 85.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emma_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 153 | 162.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/emma_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/emma_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, solo, looking_at_viewer, smile, skirt, pantyhose, open_mouth, fingerless_gloves, green_eyes, black_headwear, long_sleeves, thighs, black_gloves, jewelry, shirt, blush, cleavage, necktie |
| 1 | 7 |  |  |  |  |  | christmas, looking_at_viewer, santa_hat, 1girl, bare_shoulders, cleavage, solo, white_thighhighs, red_dress, jewelry, open_mouth, santa_costume, fur_trim, holding, sitting, smile, thick_thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | smile | skirt | pantyhose | open_mouth | fingerless_gloves | green_eyes | black_headwear | long_sleeves | thighs | black_gloves | jewelry | shirt | blush | cleavage | necktie | christmas | santa_hat | bare_shoulders | white_thighhighs | red_dress | santa_costume | fur_trim | holding | sitting | thick_thighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:------------|:-------------|:--------------------|:-------------|:-----------------|:---------------|:---------|:---------------|:----------|:--------|:--------|:-----------|:----------|:------------|:------------|:-----------------|:-------------------|:------------|:----------------|:-----------|:----------|:----------|:---------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | X | | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X |
|
matejklemen/nucle | ---
license: other
dataset_info:
- config_name: public
features:
- name: src_tokens
sequence: string
- name: tgt_tokens
sequence: string
- name: corrections
list:
- name: idx_src
sequence: int32
- name: idx_tgt
sequence: int32
- name: corr_type
dtype: string
splits:
- name: train
download_size: 0
dataset_size: 0
- config_name: private
features:
- name: src_tokens
sequence: string
- name: tgt_tokens
sequence: string
- name: corrections
list:
- name: idx_src
sequence: int32
- name: idx_tgt
sequence: int32
- name: corr_type
dtype: string
splits:
- name: train
download_size: 0
dataset_size: 0
---
**Important**: This is only a script for loading the data; the data itself is private. The script will only work if you have access to the data, which you may request for non-commercial purposes [here](https://sterling8.d2.comp.nus.edu.sg/nucle_download/nucle.php).
```python
import datasets

data = datasets.load_dataset("matejklemen/nucle", "private", data_dir="<dir-of-private-data>", ignore_verifications=True)
```
The `ignore_verifications=True` argument is important: the datasets library normally builds validation statistics that it verifies loaded data against,
and these cannot be correctly computed when the data is not public.
|
5w4n/OSCAR-2019-Burmese-fix | ---
pretty_name: OSCAR-2019-Burmese-fix
annotations_creators:
- no-annotation
configs:
- unshuffled_deduplicated_cleaned_my
language:
- my
language_creators:
- found
license:
- cc0-1.0
multilinguality:
- monolingual
paperswithcode_id: oscar
size_categories:
- 100K<n<1M
source_datasets:
- extended|oscar
tags:
- burmese
- myanmar
- myanmar-news
- myanmar-corpus
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# Dataset Card for OSCAR-2019-Burmese-fix
## Dataset Description
This dataset is a cleaned version of the Myanmar-language portion of the OSCAR 2019 dataset.
### Contributions
[Swan Htet Aung](https://github.com/swanhtet1992)
|
open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B | ---
pretty_name: Evaluation run of cerebras/Cerebras-GPT-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cerebras/Cerebras-GPT-2.7B](https://huggingface.co/cerebras/Cerebras-GPT-2.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T22:31:27.618603](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B/blob/main/results_2023-10-15T22-31-27.618603.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652192537,\n \"f1\": 0.045849412751678,\n\
\ \"f1_stderr\": 0.0011802883893565243,\n \"acc\": 0.27299268238536645,\n\
\ \"acc_stderr\": 0.007928850948897767\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192537,\n\
\ \"f1\": 0.045849412751678,\n \"f1_stderr\": 0.0011802883893565243\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5414364640883977,\n \"acc_stderr\": 0.014004146853791914\n\
\ }\n}\n```"
repo_url: https://huggingface.co/cerebras/Cerebras-GPT-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T22_31_27.618603
path:
- '**/details_harness|drop|3_2023-10-15T22-31-27.618603.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T22-31-27.618603.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T22_31_27.618603
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-31-27.618603.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T22-31-27.618603.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:27:41.831056.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T16:27:41.831056.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T22_31_27.618603
path:
- '**/details_harness|winogrande|5_2023-10-15T22-31-27.618603.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T22-31-27.618603.parquet'
- config_name: results
data_files:
- split: 2023_07_19T16_27_41.831056
path:
- results_2023-07-19T16:27:41.831056.parquet
- split: 2023_10_15T22_31_27.618603
path:
- results_2023-10-15T22-31-27.618603.parquet
- split: latest
path:
- results_2023-10-15T22-31-27.618603.parquet
---
# Dataset Card for Evaluation run of cerebras/Cerebras-GPT-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/cerebras/Cerebras-GPT-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [cerebras/Cerebras-GPT-2.7B](https://huggingface.co/cerebras/Cerebras-GPT-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B",
"harness_winogrande_5",
split="train")
```
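The timestamped split names are derived from the run timestamp by replacing the separators inside the date and time with underscores. As an illustration (the helper below is ours, not part of the card's tooling):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2023-10-15T22:31:27.618603' to the
    split name used in the configurations, '2023_10_15T22_31_27.618603'."""
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-10-15T22:31:27.618603"))
# -> 2023_10_15T22_31_27.618603
```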
## Latest results
These are the [latest results from run 2023-10-15T22:31:27.618603](https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B/blob/main/results_2023-10-15T22-31-27.618603.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192537,
"f1": 0.045849412751678,
"f1_stderr": 0.0011802883893565243,
"acc": 0.27299268238536645,
"acc_stderr": 0.007928850948897767
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192537,
"f1": 0.045849412751678,
"f1_stderr": 0.0011802883893565243
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.5414364640883977,
"acc_stderr": 0.014004146853791914
}
}
```
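The figures in the "all" block are consistent with an unweighted per-task mean of each metric; for instance, the aggregated `acc` and `acc_stderr` above can be reproduced from the two accuracy-reporting tasks (a sketch, assuming simple averaging):

```python
# Per-task accuracies copied from the results above.
task_acc = {
    "harness|gsm8k|5": 0.004548900682335102,
    "harness|winogrande|5": 0.5414364640883977,
}
task_acc_stderr = {
    "harness|gsm8k|5": 0.0018535550440036204,
    "harness|winogrande|5": 0.014004146853791914,
}

mean_acc = sum(task_acc.values()) / len(task_acc)
mean_stderr = sum(task_acc_stderr.values()) / len(task_acc_stderr)
print(mean_acc)     # ~0.272993, matching "all" -> "acc"
print(mean_stderr)  # ~0.007929, matching "all" -> "acc_stderr"
```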
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vietgpt/orca_en | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
- name: meta
struct:
- name: subset
dtype: string
splits:
- name: train
num_bytes: 6194081932
num_examples: 3601717
- name: test
num_bytes: 1093059093
num_examples: 635599
download_size: 3534002711
dataset_size: 7287141025
---
# Dataset Card for "orca_en"
```python
def preprocess(
    sample,
    instruction_key="### Instruction:",
    response_key="<|endofprompt|>",
    end_key="<|endoftext|>"
):
    # Join the optional system prompt, instruction, and response into a
    # single training string, delimited by the given marker tokens.
    system_prompt = sample['system_prompt']
instruction = sample['question']
response = sample['response']
if system_prompt:
return {'text': """{system_prompt}
{instruction_key}
{instruction}
{response_key}
{response}
{end_key}""".format(
system_prompt=system_prompt,
instruction_key=instruction_key,
instruction=instruction,
response_key=response_key,
response=response,
end_key=end_key,
)}
else:
return {'text': """{instruction_key}
{instruction}
{response_key}
{response}
{end_key}""".format(
instruction_key=instruction_key,
instruction=instruction,
response_key=response_key,
response=response,
end_key=end_key,
)}
"""
You are an AI assistant. Provide a detailed answer so user don’t need to search outside to understand the answer.
### Instruction:
Q: Answer the following question given this paragraph: The kidneys also secrete hormones that help maintain homeostasis. For example, they produce a hormone that stimulates bone marrow to produce red blood cells when more are needed. They also secrete a hormone that regulates blood pressure and keeps it in a normal range. Q: What organs secrete hormones that help maintain homeostasis? A:
The answer is:
<|endofprompt|>
The kidneys are the organs that secrete hormones to help maintain homeostasis. They produce a hormone that stimulates bone marrow to produce red blood cells when needed, and they also secrete a hormone that regulates blood pressure, keeping it within a normal range.
<|endoftext|>
"""
``` |
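The formatter can be exercised on a toy sample; the condensed version below joins the same sections with newlines, exactly as the card's template does (the helper name and sample values are placeholders, not records from the dataset):

```python
def format_sample(sample,
                  instruction_key="### Instruction:",
                  response_key="<|endofprompt|>",
                  end_key="<|endoftext|>"):
    # Condensed equivalent of the card's preprocess(): prepend the optional
    # system prompt, then join the prompt sections with newlines.
    parts = [sample['system_prompt']] if sample['system_prompt'] else []
    parts += [instruction_key, sample['question'], response_key,
              sample['response'], end_key]
    return {'text': "\n".join(parts)}

toy = {'system_prompt': 'You are an AI assistant.',
       'question': 'What is 2 + 2?',
       'response': '4'}
print(format_sample(toy)['text'])
```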
open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3 | ---
pretty_name: Evaluation run of Intel/neural-chat-7b-v3-3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Intel/neural-chat-7b-v3-3](https://huggingface.co/Intel/neural-chat-7b-v3-3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T20:33:34.862293](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3/blob/main/results_2023-12-09T20-33-34.862293.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.633718840288445,\n\
\ \"acc_stderr\": 0.03262856399270551,\n \"acc_norm\": 0.6351165946232198,\n\
\ \"acc_norm_stderr\": 0.03329008839330021,\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.017471992091697534,\n \"mc2\": 0.6301479198844473,\n\
\ \"mc2_stderr\": 0.015176409746133967\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6373720136518771,\n \"acc_stderr\": 0.014049106564955007,\n\
\ \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817837\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6617207727544314,\n\
\ \"acc_stderr\": 0.004721571443354415,\n \"acc_norm\": 0.8526190001991635,\n\
\ \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.0420392104015628,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.0420392104015628\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958546,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958546\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520193,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520193\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.02468597928623996,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.02468597928623996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091826,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091826\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43546284224250326,\n\
\ \"acc_stderr\": 0.01266341210124834,\n \"acc_norm\": 0.43546284224250326,\n\
\ \"acc_norm_stderr\": 0.01266341210124834\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n\
\ \"mc1_stderr\": 0.017471992091697534,\n \"mc2\": 0.6301479198844473,\n\
\ \"mc2_stderr\": 0.015176409746133967\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626913\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.013428382481274231\n }\n}\n```"
repo_url: https://huggingface.co/Intel/neural-chat-7b-v3-3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|arc:challenge|25_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|gsm8k|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hellaswag|10_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T20-33-34.862293.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- '**/details_harness|winogrande|5_2023-12-09T20-33-34.862293.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T20-33-34.862293.parquet'
- config_name: results
data_files:
- split: 2023_12_09T20_33_34.862293
path:
- results_2023-12-09T20-33-34.862293.parquet
- split: latest
path:
- results_2023-12-09T20-33-34.862293.parquet
---
# Dataset Card for Evaluation run of Intel/neural-chat-7b-v3-3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Intel/neural-chat-7b-v3-3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Intel/neural-chat-7b-v3-3](https://huggingface.co/Intel/neural-chat-7b-v3-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3",
	"harness_winogrande_5",
	split="latest")
```
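If you need to order or compare several runs, the timestamped split names can be converted back into `datetime` objects. A minimal sketch, using the split name listed in this card's configurations:

```python
from datetime import datetime

# Split names encode the run timestamp, e.g. "2023_12_09T20_33_34.862293".
split_name = "2023_12_09T20_33_34.862293"
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # 2023-12-09T20:33:34.862293
```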
## Latest results
These are the [latest results from run 2023-12-09T20:33:34.862293](https://huggingface.co/datasets/open-llm-leaderboard/details_Intel__neural-chat-7b-v3-3/blob/main/results_2023-12-09T20-33-34.862293.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.633718840288445,
"acc_stderr": 0.03262856399270551,
"acc_norm": 0.6351165946232198,
"acc_norm_stderr": 0.03329008839330021,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697534,
"mc2": 0.6301479198844473,
"mc2_stderr": 0.015176409746133967
},
"harness|arc:challenge|25": {
"acc": 0.6373720136518771,
"acc_stderr": 0.014049106564955007,
"acc_norm": 0.6689419795221843,
"acc_norm_stderr": 0.013752062419817837
},
"harness|hellaswag|10": {
"acc": 0.6617207727544314,
"acc_stderr": 0.004721571443354415,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.0420392104015628,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.0420392104015628
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958546,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958546
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520193,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520193
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.02468597928623996,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.02468597928623996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091826,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091826
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43546284224250326,
"acc_stderr": 0.01266341210124834,
"acc_norm": 0.43546284224250326,
"acc_norm_stderr": 0.01266341210124834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.017471992091697534,
"mc2": 0.6301479198844473,
"mc2_stderr": 0.015176409746133967
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626913
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274231
}
}
```
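To work with these nested results programmatically, one option is to flatten them into per-task rows. A minimal sketch using a two-entry excerpt of the JSON above (the values are copied verbatim from it):

```python
# A small excerpt of the nested results dict shown above.
results = {
    "harness|winogrande|5": {
        "acc": 0.7963693764798737,
        "acc_stderr": 0.011317798781626913,
    },
    "harness|gsm8k|5": {
        "acc": 0.6110689916603488,
        "acc_stderr": 0.013428382481274231,
    },
}

# Flatten into (task, metric, value) rows for easy tabulation or filtering.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in metrics.items()
]
print(len(rows))  # 4
```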
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HEN10/doc_train | ---
license: openrail
---
|
open-llm-leaderboard/details_Kukedlc__NeuralKuke-4-All-7b | ---
pretty_name: Evaluation run of Kukedlc/NeuralKuke-4-All-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralKuke-4-All-7b](https://huggingface.co/Kukedlc/NeuralKuke-4-All-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralKuke-4-All-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-04-06T18:36:08.064827](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralKuke-4-All-7b/blob/main/results_2024-04-06T18-36-08.064827.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532718357049047,\n\
\ \"acc_stderr\": 0.032068009628509475,\n \"acc_norm\": 0.6522890950271336,\n\
\ \"acc_norm_stderr\": 0.03274364946737568,\n \"mc1\": 0.6070991432068543,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.7538380741680661,\n\
\ \"mc2_stderr\": 0.014166454648098353\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.734641638225256,\n \"acc_norm_stderr\": 0.012902554762313962\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7173869747062338,\n\
\ \"acc_stderr\": 0.004493495872000108,\n \"acc_norm\": 0.8900617406891057,\n\
\ \"acc_norm_stderr\": 0.003121734839569858\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4402234636871508,\n\
\ \"acc_stderr\": 0.01660256461504994,\n \"acc_norm\": 0.4402234636871508,\n\
\ \"acc_norm_stderr\": 0.01660256461504994\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\
\ \"acc_stderr\": 0.012753716929101008,\n \"acc_norm\": 0.4745762711864407,\n\
\ \"acc_norm_stderr\": 0.012753716929101008\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6070991432068543,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.7538380741680661,\n\
\ \"mc2_stderr\": 0.014166454648098353\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065604\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337694\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralKuke-4-All-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|arc:challenge|25_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|gsm8k|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hellaswag|10_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T18-36-08.064827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T18-36-08.064827.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- '**/details_harness|winogrande|5_2024-04-06T18-36-08.064827.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T18-36-08.064827.parquet'
- config_name: results
data_files:
- split: 2024_04_06T18_36_08.064827
path:
- results_2024-04-06T18-36-08.064827.parquet
- split: latest
path:
- results_2024-04-06T18-36-08.064827.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralKuke-4-All-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralKuke-4-All-7b](https://huggingface.co/Kukedlc/NeuralKuke-4-All-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralKuke-4-All-7b",
"harness_winogrande_5",
split="train")
```
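The config name passed above (`"harness_winogrande_5"`) follows a simple naming convention, visible in the YAML header: the harness task name with `|`, `:` and `-` replaced by underscores, prefixed with `harness_` and suffixed with the few-shot count. A small sketch of that mapping (the helper name is illustrative, not part of the `datasets` library):

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Map a harness task name (e.g. 'truthfulqa:mc') to the config
    name used in this repository's YAML header."""
    cleaned = task.replace(":", "_").replace("-", "_").replace("|", "_")
    return f"harness_{cleaned}_{n_shot}"

print(harness_config_name("truthfulqa:mc", 0))  # harness_truthfulqa_mc_0
print(harness_config_name("winogrande", 5))     # harness_winogrande_5
```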
## Latest results
These are the [latest results from run 2024-04-06T18:36:08.064827](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralKuke-4-All-7b/blob/main/results_2024-04-06T18-36-08.064827.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532718357049047,
"acc_stderr": 0.032068009628509475,
"acc_norm": 0.6522890950271336,
"acc_norm_stderr": 0.03274364946737568,
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.7538380741680661,
"mc2_stderr": 0.014166454648098353
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.734641638225256,
"acc_norm_stderr": 0.012902554762313962
},
"harness|hellaswag|10": {
"acc": 0.7173869747062338,
"acc_stderr": 0.004493495872000108,
"acc_norm": 0.8900617406891057,
"acc_norm_stderr": 0.003121734839569858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4402234636871508,
"acc_stderr": 0.01660256461504994,
"acc_norm": 0.4402234636871508,
"acc_norm_stderr": 0.01660256461504994
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101008,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101008
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.7538380741680661,
"mc2_stderr": 0.014166454648098353
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065604
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337694
}
}
```
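To sanity-check figures like the headline "all" accuracy, the per-task `acc` values above can be averaged directly. The snippet below uses only a three-task excerpt copied from the JSON, so its mean is illustrative and will not match the full-run "all" value:

```python
# Three-task excerpt copied from the results JSON above.
excerpt = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6666666666666666},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

# Unweighted mean accuracy over the excerpt.
mean_acc = sum(task["acc"] for task in excerpt.values()) / len(excerpt)
print(round(mean_acc, 4))
```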
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
malucoelhaofc/VocalLivreV2 | ---
license: openrail
---
|
parhasard/eir-alpaca-type | ---
license: mit
---
|
nev/aitch-dataset | ---
license: other
---
|
open-llm-leaderboard/details_openchat__openchat_v2 | ---
pretty_name: Evaluation run of openchat/openchat_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat_v2](https://huggingface.co/openchat/openchat_v2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T23:33:59.473281](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2/blob/main/results_2023-10-18T23-33-59.473281.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826953,\n \"f1\": 0.06369546979865812,\n\
\ \"f1_stderr\": 0.0013881754743750058,\n \"acc\": 0.4267044764366107,\n\
\ \"acc_stderr\": 0.009941310874908384\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826953,\n\
\ \"f1\": 0.06369546979865812,\n \"f1_stderr\": 0.0013881754743750058\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09097801364670205,\n \
\ \"acc_stderr\": 0.007921322844013628\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803141\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|arc:challenge|25_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T23_33_59.473281
path:
- '**/details_harness|drop|3_2023-10-18T23-33-59.473281.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T23-33-59.473281.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T23_33_59.473281
path:
- '**/details_harness|gsm8k|5_2023-10-18T23-33-59.473281.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T23-33-59.473281.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hellaswag|10_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T16:15:43.375202.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T16:15:43.375202.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T23_33_59.473281
path:
- '**/details_harness|winogrande|5_2023-10-18T23-33-59.473281.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T23-33-59.473281.parquet'
- config_name: results
data_files:
- split: 2023_07_24T16_15_43.375202
path:
- results_2023-07-24T16:15:43.375202.parquet
- split: 2023_10_18T23_33_59.473281
path:
- results_2023-10-18T23-33-59.473281.parquet
- split: latest
path:
- results_2023-10-18T23-33-59.473281.parquet
---
# Dataset Card for Evaluation run of openchat/openchat_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v2](https://huggingface.co/openchat/openchat_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T23:33:59.473281](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2/blob/main/results_2023-10-18T23-33-59.473281.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826953,
"f1": 0.06369546979865812,
"f1_stderr": 0.0013881754743750058,
"acc": 0.4267044764366107,
"acc_stderr": 0.009941310874908384
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826953,
"f1": 0.06369546979865812,
"f1_stderr": 0.0013881754743750058
},
"harness|gsm8k|5": {
"acc": 0.09097801364670205,
"acc_stderr": 0.007921322844013628
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803141
}
}
```
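As a sanity check on the numbers above, the aggregated `acc` in the `all` block appears to be the unweighted mean of the per-task accuracies. This small sketch (plain Python, with the values copied from the JSON above) reproduces it:

```python
# Per-task accuracies copied from the JSON block above.
task_acc = {
    "harness|gsm8k|5": 0.09097801364670205,
    "harness|winogrande|5": 0.7624309392265194,
}

# Unweighted mean over the accuracy-based tasks.
all_acc = sum(task_acc.values()) / len(task_acc)

# Consistent with the reported "all" accuracy.
assert abs(all_acc - 0.4267044764366107) < 1e-12
```

The `em` and `f1` entries in the `all` block are simply copied from the single task (`harness|drop|3`) that reports them.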
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KETI-AIR/aihub_paper_summarization | ---
license: apache-2.0
---
|
result-kand2-sdxl-wuerst-karlo/74441c7b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1358
dataset_size: 178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "74441c7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VietnamAIHub/Vietnamese_Dolly_15k | ---
license: cc-by-3.0
---
|
Obreyer/freddy | ---
license: openrail
---
|
luizlzg/llm_geral | ---
license: apache-2.0
task_categories:
- text-generation
language:
- pt
size_categories:
- 10K<n<100K
configs:
- config_name: default
data_files:
- split: train
path: llm_geral_treino*
- split: validation
path: llm_geral_valid*
---
|
CyberHarem/houraisan_kaguya_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of houraisan_kaguya/蓬莱山輝夜/호라이산카구야 (Touhou)
This is the dataset of houraisan_kaguya/蓬莱山輝夜/호라이산카구야 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, black_hair, very_long_hair, bow, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 631.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 410.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1026 | 722.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 576.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1026 | 939.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houraisan_kaguya_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, hime_cut, long_sleeves, pink_shirt, red_skirt, solo, wide_sleeves, blunt_bangs, full_moon, looking_at_viewer, white_bowtie, closed_mouth, frilled_shirt_collar, smile, jeweled_branch_of_hourai, night_sky, sleeves_past_wrists, bamboo, frilled_sleeves, holding |
| 1 | 9 |  |  |  |  |  | 1girl, full_moon, long_sleeves, shirt, skirt, solo, wide_sleeves, looking_at_viewer, smile, jeweled_branch_of_hourai, night_sky, starry_sky, bamboo |
| 2 | 10 |  |  |  |  |  | 1girl, jeweled_branch_of_hourai, solo, full_moon, japanese_clothes, skirt, wide_sleeves, smile |
| 3 | 7 |  |  |  |  |  | 1girl, jeweled_branch_of_hourai, solo, wide_sleeves, smile, japanese_clothes, simple_background, white_background |
| 4 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, skirt, smile, solo, wide_sleeves, full_moon, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hime_cut | long_sleeves | pink_shirt | red_skirt | solo | wide_sleeves | blunt_bangs | full_moon | looking_at_viewer | white_bowtie | closed_mouth | frilled_shirt_collar | smile | jeweled_branch_of_hourai | night_sky | sleeves_past_wrists | bamboo | frilled_sleeves | holding | shirt | skirt | starry_sky | japanese_clothes | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------------|:-------------|:------------|:-------|:---------------|:--------------|:------------|:--------------------|:---------------|:---------------|:-----------------------|:--------|:---------------------------|:------------|:----------------------|:---------|:------------------|:----------|:--------|:--------|:-------------|:-------------------|:--------------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | | | X | X | | X | X | | | | X | X | X | | X | | | X | X | X | | | |
| 2 | 10 |  |  |  |  |  | X | | | | | X | X | | X | | | | | X | X | | | | | | | X | | X | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | X | | | | | | | X | X | | | | | | | | | X | X | X |
| 4 | 6 |  |  |  |  |  | X | | X | | | X | X | | X | X | | | | X | | | | | | | X | X | | | | |
|
CyberHarem/ueda_suzuho_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ueda_suzuho/上田鈴帆 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ueda_suzuho/上田鈴帆 (THE iDOLM@STER: Cinderella Girls), containing 50 images and their tags.
The core tags of this character are `short_hair, brown_hair, brown_eyes, ahoge`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 50 | 33.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ueda_suzuho_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 50 | 28.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ueda_suzuho_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 96 | 48.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ueda_suzuho_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 50 | 32.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ueda_suzuho_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 96 | 54.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ueda_suzuho_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ueda_suzuho_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, simple_background, white_background, open_mouth, sweat, grin |
| 1 | 17 |  |  |  |  |  | 1girl, character_name, solo, card_(medium), sun_symbol, smile, costume, orange_background, star_(symbol), open_mouth, red_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | white_background | open_mouth | sweat | grin | character_name | card_(medium) | sun_symbol | smile | costume | orange_background | star_(symbol) | red_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:-------------------|:-------------|:--------|:-------|:-----------------|:----------------|:-------------|:--------|:----------|:--------------------|:----------------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 17 |  |  |  |  |  | X | X | | | | X | | | X | X | X | X | X | X | X | X |
|
evilback/TATR | ---
license: openrail
---
|
arieg/bw_spec_cls_4_00 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '140'
'2': '2'
'3': '5'
splits:
- name: train
num_bytes: 21844211.0
num_examples: 400
- name: test
num_bytes: 4370837.0
num_examples: 80
download_size: 26107310
dataset_size: 26215048.0
---
# Dataset Card for "bw_spec_cls_4_00"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlanYky/offensive-no-instruction-with-symbol | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 2719611
num_examples: 2000
download_size: 1463462
dataset_size: 2719611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hoangphu7122002ai/expression_query_sql | ---
dataset_info:
features:
- name: info_map_field
sequence: string
- name: info_choose
sequence: string
- name: field_choose
sequence: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 19943421
num_examples: 17441
download_size: 8675795
dataset_size: 19943421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-87000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1045274
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
codeparrot/conala-mined-curated | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: parent_answer_post_id
dtype: int64
- name: prob
dtype: float64
- name: snippet
dtype: string
- name: intent
dtype: string
- name: rewritten_intent
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 136332874
num_examples: 593891
download_size: 94688053
dataset_size: 136332874
---
# Conala-mined-curated
Conala-mined-curated is a dataset based on the mined subset of the [CoNaLa dataset](https://huggingface.co/datasets/neulab/conala/viewer/mined/train).
CoNaLa is a dataset crawled from Stack Overflow. Part of it is filtered and curated to form a training set and a test set. However, the mined part is not comparably
post-processed. It is a set of 600K examples that we decided to work on.
## Dataset description
The conala datasets have 3 columns of interest. We give their description as provided by the [authors](https://conala-corpus.github.io)
- *intent* : Natural Language intent (i.e., the title of a Stack Overflow question)
- *snippet* : A code snippet that implements the intent. This is the output of systems in the challenge.
- *rewritten_intent* : Crowdsourced revised intents that try to better reflect the full meaning of the code, typically done by incorporating variable names and function arguments that appeared in the code into the intent. This is the input to be used by systems in the CoNaLa challenge.
For instruction fine-tuning, we would like to train a model to map the *rewritten_intent* to the *snippet*. However, the mined subset does not have the
column *rewritten_intent*, and *intent* is too vague to be described as an instruction, so we had to find a way to build the column *rewritten_intent* for the mined subset.
That is exactly what was done in order to build this dataset.
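As a minimal sketch of that fine-tuning goal (the prompt template and the example row below are illustrative, not part of the dataset), an instruction-tuning pair can be formed from the two columns:

```python
def to_instruction_pair(example):
    """Turn one row into a (prompt, completion) pair for instruction
    fine-tuning. The prompt template is illustrative, not prescribed."""
    return {
        "prompt": f"Write Python code to: {example['rewritten_intent']}",
        "completion": example["snippet"],
    }

# Hypothetical row with the columns described above.
row = {
    "rewritten_intent": "check if all elements in list `mylist` are identical",
    "snippet": "len(set(mylist)) == 1",
}
pair = to_instruction_pair(row)
print(pair["prompt"])
```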
## Method
The most valuable information that we have in order to recover the column *rewritten_intent* is the pair of columns *intent* and *snippet*. Fortunately, we also have the training set and the test set
of CoNaLa, which are labeled. This means that we have a view of what a high-quality triplet (*intent*, *rewritten_intent*, *snippet*) looks like. Our idea was to build a Seq2Seq model whose role
is to reconstruct the *rewritten_intent* based on the concatenation [*intent*, *snippet*].
More precisely, we fine-tuned [Google UL2](https://huggingface.co/google/ul2) to solve this task.
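A minimal sketch of that reconstruction setup (the exact concatenation format fed to UL2 is an assumption here, as is the `intent: ... snippet: ...` template):

```python
def build_rewriter_example(intent, snippet, rewritten_intent=None):
    """Source = concatenation of [intent, snippet]; target = the
    rewritten intent, available only on the labeled train/test splits.
    The "intent: ... snippet: ..." template is illustrative."""
    return {
        "source": f"intent: {intent} snippet: {snippet}",
        "target": rewritten_intent,
    }

# On the mined subset the target is unknown; the fine-tuned model predicts it.
ex = build_rewriter_example(
    intent="Sort a list in descending order",
    snippet="sorted(my_list, reverse=True)",
)
print(ex["source"])
```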
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("codeparrot/conala-mined-curated")
dataset
DatasetDict({
train: Dataset({
features: ['question_id', 'parent_answer_post_id', 'prob', 'snippet', 'intent', 'rewritten_intent', 'id'],
num_rows: 593891
})
})
```
## Additional resources
- Official site of the [CoNala-corpus](https://conala-corpus.github.io).
- [CoNaLa's card](https://huggingface.co/datasets/neulab/conala).
- [Github repository](https://github.com/ArmelRandy/Conala) of our method.
|
keylazy/ark-raw | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text1
dtype: string
- name: text2
dtype: string
splits:
- name: train
num_bytes: 274489671
num_examples: 1000000
- name: test
num_bytes: 27481428
num_examples: 100000
download_size: 189424610
dataset_size: 301971099
---
# Dataset Card for "ark-raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
levuloihust/vien-corpus-for-tokenizer | ---
language:
- vi
- en
---
# Description
The file `vien-corpus.txt` has 7,957,186 lines of mixed Vietnamese and English text. It was used as a corpus to train a tokenizer covering both Vietnamese and English.
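As a sketch of how such a corpus can be used (the actual tokenizer type and hyperparameters used for `vien-corpus.txt` are not documented here, so BPE and the settings below are assumptions), a joint Vietnamese/English tokenizer can be trained with the Hugging Face `tokenizers` library:

```python
from tokenizers import Tokenizer, models, pre_tokenizers, trainers

# Tiny in-memory stand-in for vien-corpus.txt (one sentence per line).
corpus = [
    "Machine learning is changing natural language processing.",
    "Học máy đang thay đổi xử lý ngôn ngữ tự nhiên.",
    "Tokenizers split text into subword units.",
    "Bộ tách từ chia văn bản thành các đơn vị từ phụ.",
]

tokenizer = Tokenizer(models.BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = pre_tokenizers.Whitespace()
trainer = trainers.BpeTrainer(vocab_size=500, special_tokens=["[UNK]"])
tokenizer.train_from_iterator(corpus, trainer)

# With the real corpus you would train from the file instead:
# tokenizer.train(["vien-corpus.txt"], trainer)
print(tokenizer.encode("ngôn ngữ tự nhiên").tokens)
```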
# Details
## How was this corpus created?
1. English text is from Wikipedia
2. Vietnamese text is from 2 sources:
* Crawled data from news websites
* Oscar dataset |
hk-kaden-kim/pix2struct-chartcaptioning | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: chartType
dtype: string
- name: chartElement
dtype: string
- name: dataType
dtype: string
splits:
- name: train
num_bytes: 72690622.389
num_examples: 1373
- name: test
num_bytes: 57043770.8
num_examples: 1200
- name: validation
num_bytes: 10174935.0
num_examples: 199
download_size: 0
dataset_size: 139909328.189
---
# Dataset Card for "pix2struct-chartcaptioning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Katrg/PolinaTinyBunny | ---
license: creativeml-openrail-m
tags:
- lora
- aiartchan
- stable-diffusion
- art
language:
- en
pretty_name: Polina
---
## LoRA Description
A LoRA which allows you to create images of Polina from the game Tiny Bunny.
## Weights:
I recommend using a weight of 0.57 for the best generation, but you can try experimenting with weights from 0.4 to 0.7.
## About trigger words:
- PolinaBlackWhite: trained on black-and-white images
- PolinaColor: trained on color images
Enjoy using it!
### CivitAi: https://civitai.com/models/84165/polina-tiny-bunny
## Example images
 |
viewit-ai/sobha | ---
language:
- en
--- |
qiyuw/wspalign_acl2023_eval | ---
license: cc-by-nc-sa-4.0
---
|
open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-beta | ---
pretty_name: Evaluation run of HuggingFaceH4/zephyr-7b-beta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-beta\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T23:27:56.473641](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-beta/blob/main/results_2023-12-04T23-27-56.473641.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6046654337307571,\n\
\ \"acc_stderr\": 0.03331208745152503,\n \"acc_norm\": 0.6113529654673323,\n\
\ \"acc_norm_stderr\": 0.034010916290269214,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5783301386651128,\n\
\ \"mc2_stderr\": 0.01580070269822175\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.0143610972884497,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6488747261501693,\n\
\ \"acc_stderr\": 0.004763465139038561,\n \"acc_norm\": 0.8434574785899224,\n\
\ \"acc_norm_stderr\": 0.0036262628054422106\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.029224526469124792,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.029224526469124792\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.0250107491161376,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.0250107491161376\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.02479011845933221,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.02479011845933221\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878934,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878934\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842538,\n \
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842538\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823297,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33519553072625696,\n\
\ \"acc_stderr\": 0.015788007190185884,\n \"acc_norm\": 0.33519553072625696,\n\
\ \"acc_norm_stderr\": 0.015788007190185884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281413,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765844,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765844\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215927,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215927\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768917,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768917\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5783301386651128,\n\
\ \"mc2_stderr\": 0.01580070269822175\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025397\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27065959059893857,\n \
\ \"acc_stderr\": 0.012238245006183405\n }\n}\n```"
repo_url: https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|arc:challenge|25_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|arc:challenge|25_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|arc:challenge|25_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|drop|3_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|drop|3_2023-11-18T22-22-30.225929.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-18T22-22-30.225929.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|gsm8k|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|gsm8k|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|gsm8k|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hellaswag|10_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hellaswag|10_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hellaswag|10_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-09-56.084449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-22-30.225929.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T23-27-56.473641.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T23-27-56.473641.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- '**/details_harness|winogrande|5_2023-11-18T22-09-56.084449.parquet'
- split: 2023_11_18T22_22_30.225929
path:
- '**/details_harness|winogrande|5_2023-11-18T22-22-30.225929.parquet'
- split: 2023_12_04T23_27_56.473641
path:
- '**/details_harness|winogrande|5_2023-12-04T23-27-56.473641.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T23-27-56.473641.parquet'
- config_name: results
data_files:
- split: 2023_11_18T22_09_56.084449
path:
- results_2023-11-18T22-09-56.084449.parquet
- split: 2023_11_18T22_22_30.225929
path:
- results_2023-11-18T22-22-30.225929.parquet
- split: 2023_12_04T23_27_56.473641
path:
- results_2023-12-04T23-27-56.473641.parquet
- split: latest
path:
- results_2023-12-04T23-27-56.473641.parquet
---
# Dataset Card for Evaluation run of HuggingFaceH4/zephyr-7b-beta
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-beta",
"harness_winogrande_5",
	split="latest")
```
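The per-run split names above follow a simple convention: the run timestamp with `-` and `:` replaced by `_`. A small helper sketching this mapping (hypothetical, not part of the `datasets` library):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (as it appears in result file names or JSON)
    to the corresponding split name in this dataset.

    Split names replace '-' and ':' in the timestamp with '_'.
    """
    return ts.replace("-", "_").replace(":", "_")

# The latest run in this repo:
print(timestamp_to_split("2023-12-04T23:27:56.473641"))
# 2023_12_04T23_27_56.473641
```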
## Latest results
These are the [latest results from run 2023-12-04T23:27:56.473641](https://huggingface.co/datasets/open-llm-leaderboard/details_HuggingFaceH4__zephyr-7b-beta/blob/main/results_2023-12-04T23-27-56.473641.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6046654337307571,
"acc_stderr": 0.03331208745152503,
"acc_norm": 0.6113529654673323,
"acc_norm_stderr": 0.034010916290269214,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5783301386651128,
"mc2_stderr": 0.01580070269822175
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.0143610972884497,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6488747261501693,
"acc_stderr": 0.004763465139038561,
"acc_norm": 0.8434574785899224,
"acc_norm_stderr": 0.0036262628054422106
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.029224526469124792,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.029224526469124792
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.0250107491161376,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.0250107491161376
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.02479011845933221,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.02479011845933221
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878934,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878934
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842538,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842538
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33519553072625696,
"acc_stderr": 0.015788007190185884,
"acc_norm": 0.33519553072625696,
"acc_norm_stderr": 0.015788007190185884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281413,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765844,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765844
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215927,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215927
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768917,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768917
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5783301386651128,
"mc2_stderr": 0.01580070269822175
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025397
},
"harness|gsm8k|5": {
"acc": 0.27065959059893857,
"acc_stderr": 0.012238245006183405
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
heliosprime/twitter_dataset_1713057098 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 15102
num_examples: 34
download_size: 9313
dataset_size: 15102
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713057098"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidberenstein1957/ultra-feedback-dutch-cleaned-hq | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 10327337
num_examples: 21577
- name: test
num_bytes: 517072
num_examples: 1136
download_size: 6680583
dataset_size: 10844409
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
18moumi/data_docs.jsonl | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 44839
num_examples: 142
download_size: 20983
dataset_size: 44839
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_docs.jsonl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/massive_eng_DA3_tokenized | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 97253830
num_examples: 138200
download_size: 22040467
dataset_size: 97253830
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "massive_eng_DA3_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/shameimaru_aya_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shameimaru_aya/射命丸文/샤메이마루아야 (Touhou)
This is the dataset of shameimaru_aya/射命丸文/샤메이마루아야 (Touhou), containing 500 images and their tags.
The core tags of this character are `hat, short_hair, tokin_hat, red_eyes, black_hair, breasts, wings, black_wings, red_headwear, bangs, pointy_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 544.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shameimaru_aya_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 339.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shameimaru_aya_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1094 | 661.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shameimaru_aya_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 493.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shameimaru_aya_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1094 | 887.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shameimaru_aya_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shameimaru_aya_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_skirt, collared_shirt, pom_pom_(clothes), puffy_short_sleeves, solo, white_shirt, bird_wings, black_ribbon, frilled_skirt, holding, looking_at_viewer, smile, feathered_wings, open_mouth, simple_background, hair_between_eyes, white_background |
| 1 | 8 |  |  |  |  |  | 1girl, black_skirt, collared_shirt, frilled_skirt, pom_pom_(clothes), red_footwear, solo, tengu-geta, white_shirt, black_ribbon, looking_at_viewer, simple_background, smile, white_background, closed_mouth, full_body, hauchiwa, holding_fan, bird_wings, feathered_wings, belt, black_socks, kneehighs, neck_ribbon, puffy_short_sleeves, standing, white_socks |
| 2 | 5 |  |  |  |  |  | 1girl, black_bowtie, collared_shirt, hair_between_eyes, puffy_short_sleeves, solo, white_shirt, black_skirt, blush, looking_at_viewer, simple_background, white_background, buttons, holding, open_mouth, pom_pom_(clothes), :d, belt, bird_wings, closed_mouth, cowboy_shot, feathered_wings, frilled_skirt, large_breasts, upper_body |
| 3 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, skirt, solo, smile, hand_fan, tengu-geta, feathers, hauchiwa |
| 4 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, open_shirt, solo, nipples, blush, medium_breasts, no_bra, open_mouth, panties |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, nipples, solo, censored, medium_breasts, nude, pussy, anus, open_mouth, spread_legs, tears, cum, navel, on_back |
| 6 | 7 |  |  |  |  |  | 1girl, nipples, cum_on_breasts, large_breasts, looking_at_viewer, open_mouth, blush, hetero, 1boy, facial, penis, solo_focus, medium_breasts, open_shirt, smile |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, cowgirl_position, cum_in_pussy, girl_on_top, hetero, looking_at_viewer, navel, nipples, penis, sex, solo_focus, vaginal, blush, hair_between_eyes, large_breasts, mosaic_censoring, collarbone, completely_nude, indoors, open_mouth, overflow, pom_pom_(clothes), pov, spread_legs, closed_mouth, medium_breasts, pubic_hair, smile, stomach, sweat, thighs |
| 8 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, penis, solo_focus, large_breasts, navel, nude, open_mouth, sex, spread_legs, vaginal, bar_censor, cum_in_pussy, kneehighs, on_side, pom_pom_(clothes) |
| 9 | 7 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, white_panties, medium_breasts, open_mouth, skirt |
| 10 | 22 |  |  |  |  |  | 1girl, solo, looking_at_viewer, kourindou_tengu_costume, smile, japanese_clothes, obi, pom_pom_(clothes), wide_sleeves, detached_sleeves, open_mouth, ribbon_trim |
| 11 | 6 |  |  |  |  |  | 1girl, big_belly, blush, fat, large_breasts, skirt, solo, bursting_breasts, undersized_clothes, collared_shirt, d:, looking_at_viewer, navel, open_mouth, plump, pom_pom_(clothes), sweat, tengu-geta, thick_thighs, v-shaped_eyebrows |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_skirt | collared_shirt | pom_pom_(clothes) | puffy_short_sleeves | solo | white_shirt | bird_wings | black_ribbon | frilled_skirt | holding | looking_at_viewer | smile | feathered_wings | open_mouth | simple_background | hair_between_eyes | white_background | red_footwear | tengu-geta | closed_mouth | full_body | hauchiwa | holding_fan | belt | black_socks | kneehighs | neck_ribbon | standing | white_socks | black_bowtie | blush | buttons | :d | cowboy_shot | large_breasts | upper_body | skirt | hand_fan | feathers | open_shirt | nipples | medium_breasts | no_bra | panties | censored | nude | pussy | anus | spread_legs | tears | cum | navel | on_back | cum_on_breasts | hetero | 1boy | facial | penis | solo_focus | cowgirl_position | cum_in_pussy | girl_on_top | sex | vaginal | mosaic_censoring | collarbone | completely_nude | indoors | overflow | pov | pubic_hair | stomach | sweat | thighs | bar_censor | on_side | white_panties | kourindou_tengu_costume | japanese_clothes | obi | wide_sleeves | detached_sleeves | ribbon_trim | big_belly | fat | bursting_breasts | undersized_clothes | d: | plump | thick_thighs | v-shaped_eyebrows |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------|:-----------------|:--------------------|:----------------------|:-------|:--------------|:-------------|:---------------|:----------------|:----------|:--------------------|:--------|:------------------|:-------------|:--------------------|:--------------------|:-------------------|:---------------|:-------------|:---------------|:------------|:-----------|:--------------|:-------|:--------------|:------------|:--------------|:-----------|:--------------|:---------------|:--------|:----------|:-----|:--------------|:----------------|:-------------|:--------|:-----------|:-----------|:-------------|:----------|:-----------------|:---------|:----------|:-----------|:-------|:--------|:-------|:--------------|:--------|:------|:--------|:----------|:-----------------|:---------|:-------|:---------|:--------|:-------------|:-------------------|:---------------|:--------------|:------|:----------|:-------------------|:-------------|:------------------|:----------|:-----------|:------|:-------------|:----------|:--------|:---------|:-------------|:----------|:----------------|:--------------------------|:-------------------|:------|:---------------|:-------------------|:--------------|:------------|:------|:-------------------|:---------------------|:-----|:--------|:---------------|:--------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | | X | X | X | X | X | | | X | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | | | X | | | | | | X | X | | | | | | | X | | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | X | X | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | | | | | | | | X | X | | X | | X | | | | X | | | | | | | | | | | X | | | | X | | | | | | X | X | | | | | | | X | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | X | | | | X | | | | | | X | | | | | X | | | X | | | X | | | X | X | | X | X | | X | | X | X | | | | | | | | | | | X | X | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | |
| 10 | 22 |  |  |  |  |  | X | | | X | | X | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | | X | X | | X | | | | | | X | | | X | | | | | X | | | | | | | | | | | | X | | | | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
boblebgdu92/buzz | ---
license: other
---
|
Seanxh/twitter_dataset_1713204711 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 134471
num_examples: 315
download_size: 51234
dataset_size: 134471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nightmare-nectarine/segmentation-carla-driving | ---
license: mit
language:
- en
tags:
- Autonomous Driving
- CARLA Simulator
- ImitationLearning
size_categories:
- 10B<n<100B
pretty_name: S
---
This dataset consists of 80 episodes of driving data collected using an autopilot agent in the CARLA simulator, intended for training imitation learning models for autonomous driving tasks.
Each frame is structured as follows:
```
frame_data = {
'frame': the frame index,
'hlc': an integer representing the high-level command,
'light': an integer representing current traffic light status,
'controls': an array of [throttle, steer, brake],
'measurements': current speed in km/h,
'rgb': rgb camera image,
'segmentation': ground truth segmentation image,
}
```
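As a minimal sketch of consuming one frame (assuming each frame is a plain Python dict shaped like the listing above — the sample values here are invented for illustration, not taken from the dataset):

```python
# Sketch: unpack the control signals from a single frame dict.
# Field names mirror the structure documented above; values are made up.
frame_data = {
    'frame': 0,
    'hlc': 3,                       # high-level command
    'light': 1,                     # traffic light status
    'controls': [0.6, -0.05, 0.0],  # [throttle, steer, brake]
    'measurements': 27.4,           # speed in km/h
}

throttle, steer, brake = frame_data['controls']
speed_kmh = frame_data['measurements']
print(f"throttle={throttle}, steer={steer}, brake={brake}, speed={speed_kmh} km/h")
```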
This dataset is used in [this project](https://github.com/TheRoboticsClub/gsoc2023-Meiqi_Zhao) and the trained models are available [here](https://huggingface.co/nightmare-nectarine/segmentation-based-imitation-learning-in-CARLA). Check out the [example code](https://github.com/TheRoboticsClub/gsoc2023-Meiqi_Zhao/blob/main/src/ModifiedDeepestLSTMTinyPilotNet/utils/load_dataset.py) for loading the dataset. |
showchen/zero-liuqin | ---
license: apache-2.0
---
|
SEACrowd/indspeech_teldialog_svcsr | ---
tags:
- speech-recognition
language:
- ind
---
# indspeech_teldialog_svcsr
This is the first Indonesian speech dataset for small vocabulary continuous speech recognition (SVCSR).
The data was developed by TELKOMRisTI (R&D Division, PT Telekomunikasi Indonesia) in collaboration with Advanced
Telecommunication Research Institute International (ATR) Japan and Bandung Institute of Technology (ITB) under the
Asia-Pacific Telecommunity (APT) project in 2004 [Sakti et al., 2004]. Although it was originally developed for
a telecommunication system for hearing- and speaking-impaired people, it can be used for other applications,
e.g., automatic call centers. Furthermore, as all speakers utter the same sentences,
it can also be used for voice conversion tasks.
The text is based on a word vocabulary derived from common dialog calls,
such as calls to the 119 emergency department, the 108 telephone information department,
and ticket reservation departments. In total, the dataset consists of 20,000 utterances (about 18 hours of speech) drawn from a
70-word dialog vocabulary across 100 sentences (including single-word sentences), each uttered by 200 speakers
(100 females, 100 males). Speaker age is limited to 20-40 years, but the speakers present a wide range of spoken
dialects from different ethnic groups. Recording was conducted in parallel for both clean and telephone speech,
but only the clean speech is released due to quality issues with the telephone speech.
Each audio file is a single-channel 16-bit PCM WAV with a sample rate of 16000 Hz.
These utterances are equally split into training and test sets with 100 speakers (50 Females, 50 Males) in each set.
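As a quick sanity check on downloaded files (a sketch, not part of the official tooling), Python's standard `wave` module can verify that a file matches the stated format — single channel, 16-bit PCM, 16000 Hz. The snippet builds a tiny synthetic WAV in memory just to exercise the check:

```python
import io
import wave

def matches_format(wav_bytes: bytes) -> bool:
    """Return True if the WAV data is single-channel 16-bit PCM at 16000 Hz."""
    with wave.open(io.BytesIO(wav_bytes), 'rb') as wf:
        return (wf.getnchannels() == 1
                and wf.getsampwidth() == 2   # 16-bit = 2 bytes per sample
                and wf.getframerate() == 16000)

# Build a tiny in-memory WAV with the expected parameters.
buf = io.BytesIO()
with wave.open(buf, 'wb') as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(16000)
    wf.writeframes(b'\x00\x00' * 160)  # 10 ms of silence

print(matches_format(buf.getvalue()))
```

For real files, pass the bytes read from each downloaded WAV instead of the synthetic buffer.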
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{sakti-icslp-2004,
title = "Indonesian Speech Recognition for Hearing and Speaking Impaired People",
author = "Sakti, Sakriani and Hutagaol, Paulus and Arman, Arry Akhmad and Nakamura, Satoshi",
booktitle = "Proc. International Conference on Spoken Language Processing (INTERSPEECH - ICSLP)",
year = "2004",
    pages = "1037--1040",
    address = "Jeju Island, Korea"
}
```
## License
CC-BY-NC-SA-4.0
## Homepage
[https://github.com/s-sakti/data_indsp_teldialog_svcsr/](https://github.com/s-sakti/data_indsp_teldialog_svcsr/)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
emilykang/pathology_test | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 153955430.0
num_examples: 300
download_size: 153709138
dataset_size: 153955430.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vwxyzjn/openhermes-dev__combined__1708612612 | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates_completions
sequence: string
- name: candidate_policies
sequence: string
splits:
- name: train
num_bytes: 3782741722
num_examples: 997984
download_size: 1787559630
dataset_size: 3782741722
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gngpostalsrvc/COHeN | ---
license: mit
---
The COHeN dataset is designed for training and fine-tuning Biblical Hebrew classification models. It consists of 11968 verses from the Hebrew Bible along with labels indicating to which of the four chronological phases of the language each verse belongs. The following verses were chosen to represent each stage:
- Archaic Biblical Hebrew (ABH): Gen 49:2–27; Exod 15:1–18; Num 23:7–10, 18–24; 24:3–9; 15–24; Deut 32:1–43; Deut 33:2–39; Jud 5:2–31; 2 Sam 22:2–51; Psa 18:1–50; 68:2–35 (minus the prose introductions in Exod 15:1; Num 23:7, 18; Num 24:3, 15; Deut 33:2; 2 Sam 22:2; Ps 18:2)
- Classical Biblical Hebrew (CBH): 1 Sam 1:1–31:13; 2 Sam 1:1–22:1; 23:1–24:25; 1 Kgs 1:1–22:53; 2 Kgs 1:1–25:30
- Transitional Biblical Hebrew (TBH): Isa 40:1–54:17; Jer 1:1–10:10; 10:12–42:34; Ezek 1:1–48:35
- Late Biblical Hebrew (LBH): Eccl 1:1–12:14; Esth 1:1–10:3; Dan 1:1–2:4a; Dan 7:29–12:13, Ezra 1:1–4:6; 6:19–7:11; 7:26–10:44; Neh 1:1–13:31; 1 Chr 1:1–9:44; 12:1–40; 15:1–24; 16:7–43; 21:1–29:19; 2 Chr 7:1–3; 14:9–15:7; 17:1–19; 21:12–17; 24:25–22; 26:6–21; 29:3–31:21; 33:10–20; 34:3–7; 36:22–23
Class balance was achieved by oversampling from the three minority classes (ABH, TBH, and LBH).
The script used to generate the COHeN dataset can be found [here](https://github.com/gngpostalsrvc/COHeN/blob/main/input/generate_COHeN_dataset.py).
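The oversampling step described above can be sketched as follows. This is a minimal illustration of balancing classes by duplicating minority-class examples; the `oversample` helper and the toy corpus are assumptions for demonstration, not taken from the dataset's actual generation script (linked above).
```python
import random

def oversample(examples, target_size, seed=0):
    """Duplicate randomly chosen examples until the class reaches target_size."""
    rng = random.Random(seed)
    extra = [rng.choice(examples) for _ in range(target_size - len(examples))]
    return examples + extra

# toy corpus: verses per chronological phase (CBH is the majority class)
corpus = {
    "ABH": ["verse_a1", "verse_a2"],
    "CBH": ["verse_c%d" % i for i in range(6)],
    "TBH": ["verse_t1", "verse_t2", "verse_t3"],
    "LBH": ["verse_l1"],
}
majority = max(len(v) for v in corpus.values())
balanced = {phase: oversample(vs, majority) for phase, vs in corpus.items()}
```
After balancing, every phase contains as many verses as the majority class, and every original verse is still present.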
|
mvkvc/stack_elixir | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
splits:
- name: train
num_bytes: 2590151574
num_examples: 594074
download_size: 973079076
dataset_size: 2590151574
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "stack_elixir"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eitanli/wine_type_bu | ---
dataset_info:
features:
- name: id
dtype: int64
- name: recipe
dtype: string
- name: wine_type
dtype: string
splits:
- name: train
num_bytes: 110426494
num_examples: 74465
download_size: 54694496
dataset_size: 110426494
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wine_type_bu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Haary/train_usk_qna_alpaca_id | ---
license: llama2
---
# Syiah Kuala University Dataset V1
## Source
- [ULT USK](http://ult.usk.ac.id/) (Unit Layanan Terpadu USK)
## Content
1. Q&A data for new students
## Contact
- You can contact me by email: haryrachmat10@gmail.com |
CyberHarem/lakshmi_bai_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lakshmi_bai/ラクシュミー・バーイー/拉克什米·芭伊 (Fate/Grand Order)
This is the dataset of lakshmi_bai/ラクシュミー・バーイー/拉克什米·芭伊 (Fate/Grand Order), containing 166 images and their tags.
The core tags of this character are `long_hair, dark-skinned_female, dark_skin, white_hair, braid, twin_braids, breasts, very_long_hair, red_eyes, ahoge, hair_between_eyes, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 166 | 244.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lakshmi_bai_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 166 | 210.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lakshmi_bai_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 401 | 387.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lakshmi_bai_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lakshmi_bai_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be identifiable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 37 |  |  |  |  |  | 1girl, bare_shoulders, solo, sleeveless_shirt, looking_at_viewer, white_gloves, holding_sword, white_pants, white_shirt, thigh_boots, thighhighs, sheath, holding_gun, rifle, sideboob |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, sleeveless_shirt, solo, armpits, looking_at_viewer, upper_body, white_gloves, white_shirt, arms_behind_head, arms_up, simple_background, white_background, blush, closed_mouth, open_mouth |
| 2 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, white_bikini, bare_shoulders, cleavage, smile, thighs, holding, large_breasts, collarbone, jewelry, sarong, simple_background, weapon, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | solo | sleeveless_shirt | looking_at_viewer | white_gloves | holding_sword | white_pants | white_shirt | thigh_boots | thighhighs | sheath | holding_gun | rifle | sideboob | armpits | upper_body | arms_behind_head | arms_up | simple_background | white_background | blush | closed_mouth | open_mouth | navel | white_bikini | cleavage | smile | thighs | holding | large_breasts | collarbone | jewelry | sarong | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:-------------------|:--------------------|:---------------|:----------------|:--------------|:--------------|:--------------|:-------------|:---------|:--------------|:--------|:-----------|:----------|:-------------|:-------------------|:----------|:--------------------|:-------------------|:--------|:---------------|:-------------|:--------|:---------------|:-----------|:--------|:---------|:----------|:----------------|:-------------|:----------|:---------|:---------|
| 0 | 37 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/68759f6d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1331
dataset_size: 176
---
# Dataset Card for "68759f6d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_119 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1378701536.0
num_examples: 268648
download_size: 1412894250
dataset_size: 1378701536.0
---
# Dataset Card for "chunk_119"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mr-tydi_ru_dev | ---
pretty_name: '`mr-tydi/ru/dev`'
viewer: false
source_datasets: ['irds/mr-tydi_ru']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/ru/dev`
The `mr-tydi/ru/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/ru/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=1,375
- `qrels` (relevance assessments); count=1,375
- For `docs`, use [`irds/mr-tydi_ru`](https://huggingface.co/datasets/irds/mr-tydi_ru)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_ru_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_ru_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
DTU54DL/dmeo | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
paperswithcode_id: acronym-identification
pretty_name: Acronym Identification Dataset
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- token-classification-other-acronym-identification
train-eval-index:
- col_mapping:
labels: tags
tokens: tokens
config: default
splits:
eval_split: test
task: token-classification
task_id: entity_extraction
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
|
mcmanaman/autotrain-data-34qj-8hnp-las6 | ---
dataset_info:
features:
- name: Target
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 1029
num_examples: 30
- name: validation
num_bytes: 1029
num_examples: 30
download_size: 4432
dataset_size: 2058
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-34qj-8hnp-las6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thusinh1969/VN_DPO_NO_SPIN_1024_prompt_2048_maxlength | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1054821208.5144515
num_examples: 332528
- name: test
num_bytes: 3172127.48554844
num_examples: 1000
download_size: 458881488
dataset_size: 1057993336.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/ikazuchi_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ikazuchi/雷/雷 (Kantai Collection)
This is the dataset of ikazuchi/雷/雷 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `brown_hair, short_hair, hairclip, hair_ornament, brown_eyes, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 464.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikazuchi_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 313.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikazuchi_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1167 | 659.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikazuchi_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 429.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikazuchi_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1167 | 863.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ikazuchi_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ikazuchi_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be identifiable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, open_mouth, serafuku, skirt, solo, looking_at_viewer, black_pantyhose, neckerchief, :d |
| 1 | 13 |  |  |  |  |  | 1girl, open_mouth, serafuku, solo, looking_at_viewer, blush, :d, red_neckerchief |
| 2 | 20 |  |  |  |  |  | 1girl, open_mouth, pleated_skirt, red_neckerchief, serafuku, solo, anchor_symbol, black_sailor_collar, black_skirt, looking_at_viewer, long_sleeves, smile, simple_background, white_background, blush, skin_fang, cowboy_shot |
| 3 | 5 |  |  |  |  |  | 1girl, black_sailor_collar, red_neckerchief, serafuku, simple_background, upper_body, white_background, anchor_symbol, looking_at_viewer, solo, hair_between_eyes, grin, open_mouth |
| 4 | 11 |  |  |  |  |  | 1girl, black_thighhighs, looking_at_viewer, serafuku, solo, open_mouth, zettai_ryouiki, long_sleeves, neckerchief, blush, pleated_skirt, :d |
| 5 | 6 |  |  |  |  |  | 1girl, anchor_symbol, cat_ears, kemonomimi_mode, pleated_skirt, red_neckerchief, serafuku, solo, black_thighhighs, cat_tail, long_sleeves, looking_at_viewer, open_mouth, paw_pose, smile, white_background, black_skirt, simple_background |
| 6 | 5 |  |  |  |  |  | 1girl, blush, pleated_skirt, serafuku, smile, solo, valentine, gift_box, heart-shaped_box, looking_at_viewer, open_mouth, red_neckerchief, black_thighhighs, incoming_gift, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | serafuku | skirt | solo | looking_at_viewer | black_pantyhose | neckerchief | :d | blush | red_neckerchief | pleated_skirt | anchor_symbol | black_sailor_collar | black_skirt | long_sleeves | smile | simple_background | white_background | skin_fang | cowboy_shot | upper_body | hair_between_eyes | grin | black_thighhighs | zettai_ryouiki | cat_ears | kemonomimi_mode | cat_tail | paw_pose | valentine | gift_box | heart-shaped_box | incoming_gift |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------|:--------|:-------|:--------------------|:------------------|:--------------|:-----|:--------|:------------------|:----------------|:----------------|:----------------------|:--------------|:---------------|:--------|:--------------------|:-------------------|:------------|:--------------|:-------------|:--------------------|:-------|:-------------------|:-----------------|:-----------|:------------------|:-----------|:-----------|:------------|:-----------|:-------------------|:----------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 20 |  |  |  |  |  | X | X | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | X | X | | | | | X | | X | X | | | | X | X | | | X | X | X | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | | X | | | | X | | | | | | | | | X | X | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | X | X | | | | | X | X | X | | X | X | X | X | X | | | | | | X | | X | X | X | X | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | | X | X | | | | X | X | X | | | | X | X | | | | | | | | X | | | | | | X | X | X | X |
|
hmao/cvecpe_apis | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: args_dicts
list:
- name: default
dtype: string
- name: description
dtype: string
- name: name
dtype: string
- name: required
dtype: bool
- name: type
dtype: string
- name: returns
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: dataset
dtype: string
- name: name
dtype: string
- name: api_type
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 19587
num_examples: 14
download_size: 19268
dataset_size: 19587
---
# Dataset Card for "cvecpe_apis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_236 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1210541528.0
num_examples: 237734
download_size: 1233607237
dataset_size: 1210541528.0
---
# Dataset Card for "chunk_236"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/coref_100_1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: pronoun
dtype: string
- name: candidates
sequence: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 11705
num_examples: 89
download_size: 10762
dataset_size: 11705
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "coref_100_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/joffre_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of joffre/ジョッフル/霞飞 (Azur Lane)
This is the dataset of joffre/ジョッフル/霞飞 (Azur Lane), containing 73 images and their tags.
The core tags of this character are `breasts, twintails, large_breasts, hair_ornament, bangs, red_eyes, grey_hair, long_hair, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 135.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 65.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 190 | 151.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 114.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 190 | 230.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/joffre_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/joffre_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be identifiable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, cleavage, holding_sword, looking_at_viewer, solo, white_dress, black_gloves, black_choker, fingerless_gloves, white_thighhighs, wide_sleeves, feathered_wings, juliet_sleeves, simple_background, black_wings, medium_breasts, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_dress, black_gloves, cleavage, white_thighhighs, fingerless_gloves, parted_lips, black_choker, juliet_sleeves, thighs, medium_breasts, wide_sleeves |
| 2 | 5 |  |  |  |  |  | 1girl, blue_dress, cleavage, crown, hair_bow, looking_at_viewer, sitting, solo, white_thighhighs, pink_eyes, bare_shoulders, black_bow, blue_footwear, butterfly, garter_straps, high_heels, red_cape, wrist_cuffs, ass, detached_sleeves, frills, hair_between_eyes, official_alternate_costume, parted_lips, puffy_sleeves, sidelocks, simple_background, smile, white_background |
| 3 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, open_mouth, black_gloves, penis, solo_focus, bar_censor, fingerless_gloves, sex, vaginal, black_choker, breasts_out, cross-section, cum_in_pussy, cum_on_breasts, grabbing, internal_cumshot, purple_eyes, sweat, uterus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | holding_sword | looking_at_viewer | solo | white_dress | black_gloves | black_choker | fingerless_gloves | white_thighhighs | wide_sleeves | feathered_wings | juliet_sleeves | simple_background | black_wings | medium_breasts | white_background | parted_lips | thighs | blue_dress | crown | hair_bow | sitting | pink_eyes | bare_shoulders | black_bow | blue_footwear | butterfly | garter_straps | high_heels | red_cape | wrist_cuffs | ass | detached_sleeves | frills | hair_between_eyes | official_alternate_costume | puffy_sleeves | sidelocks | smile | 1boy | blush | hetero | nipples | open_mouth | penis | solo_focus | bar_censor | sex | vaginal | breasts_out | cross-section | cum_in_pussy | cum_on_breasts | grabbing | internal_cumshot | purple_eyes | sweat | uterus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:----------------|:--------------------|:-------|:--------------|:---------------|:---------------|:--------------------|:-------------------|:---------------|:------------------|:-----------------|:--------------------|:--------------|:-----------------|:-------------------|:--------------|:---------|:-------------|:--------|:-----------|:----------|:------------|:-----------------|:------------|:----------------|:------------|:----------------|:-------------|:-----------|:--------------|:------|:-------------------|:---------|:--------------------|:-----------------------------|:----------------|:------------|:--------|:-------|:--------|:---------|:----------|:-------------|:--------|:-------------|:-------------|:------|:----------|:--------------|:----------------|:---------------|:-----------------|:-----------|:-------------------|:--------------|:--------|:---------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | | | | | X | | | | X | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
projectbaraat/hindi-instruct-dataset-v0.1 | ---
dataset_info:
features:
- name: instruction
list:
- name: content
dtype: string
- name: role
dtype: string
- name: output
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 2804634827.6821947
num_examples: 668442
download_size: 819390520
dataset_size: 2804634827.6821947
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
santhoshmlops/Skai_Gemma_Instruct_ChatTemplate | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 14731791
num_examples: 15011
download_size: 7545887
dataset_size: 14731791
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
language:
- en
size_categories:
- 10K<n<100K
--- |