datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
ssbuild/alpaca_gpt4all | ---
license: apache-2.0
---
|
C-MTEB/CovidRetrieval-qrels | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: qid
dtype: string
- name: pid
dtype: string
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 76720
num_examples: 959
download_size: 62785
dataset_size: 76720
---
# Dataset Card for "CovidRetrieval-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vgaraujov/fapesp-v2 | ---
language:
- en
- es
- pt
license:
- cc-by-2.0
multilinguality:
- translation
task_categories:
- translation
task_ids: []
dataset_info:
- config_name: en-pt
features:
- name: translation
dtype:
translation:
languages:
- en
- pt
splits:
- name: train
num_bytes: 47417503
num_examples: 160975
- name: validation
num_bytes: 405055
num_examples: 1375
- name: test
num_bytes: 407579
num_examples: 1447
download_size: 29615550
dataset_size: 48230137
- config_name: es-pt
features:
- name: translation
dtype:
translation:
languages:
- es
- pt
splits:
- name: train
num_bytes: 47480897
num_examples: 158197
- name: validation
num_bytes: 377101
num_examples: 1302
- name: test
num_bytes: 400915
num_examples: 1379
download_size: 29829573
dataset_size: 48258913
configs:
- config_name: en-pt
data_files:
- split: train
path: en-pt/train-*
- split: validation
path: en-pt/validation-*
- split: test
path: en-pt/test-*
- config_name: es-pt
data_files:
- split: train
path: es-pt/train-*
- split: validation
path: es-pt/validation-*
- split: test
path: es-pt/test-*
---
|
openlifescienceai/mmlu_medical_genetics | ---
dataset_info:
features:
- name: subject_name
dtype: string
- name: data
struct:
- name: Correct Answer
dtype: string
- name: Correct Option
dtype: string
- name: Options
struct:
- name: A
dtype: string
- name: B
dtype: string
- name: C
dtype: string
- name: D
dtype: string
- name: Question
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 29282
num_examples: 100
- name: validation
num_bytes: 4101
num_examples: 11
- name: dev
num_bytes: 1521
num_examples: 5
download_size: 43806
dataset_size: 34904
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: dev
path: data/dev-*
---
|
nlhappy/DuIE | ---
license: mit
---
|
19kmunz/iot-23-preprocessed | ---
dataset_info:
features:
- name: id.orig_p
dtype: int64
- name: id.resp_p
dtype: int64
- name: proto
dtype: string
- name: service
dtype: string
- name: duration
dtype: float64
- name: orig_bytes
dtype: int64
- name: resp_bytes
dtype: int64
- name: conn_state
dtype: string
- name: missed_bytes
dtype: int64
- name: history
dtype: string
- name: orig_pkts
dtype: int64
- name: orig_ip_bytes
dtype: int64
- name: resp_pkts
dtype: int64
- name: resp_ip_bytes
dtype: int64
- name: label
dtype: string
splits:
- name: train
num_bytes: 93994789
num_examples: 819024
download_size: 11805369
dataset_size: 93994789
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
- tabular-classification
language:
- en
tags:
- code
pretty_name: d
---
# Aposemat IoT-23 - a Labeled Dataset with Malicious and Benign IoT Network Traffic
**Homepage:** [https://www.stratosphereips.org/datasets-iot23](https://www.stratosphereips.org/datasets-iot23)
This dataset contains a subset of the data from 20 captures of malicious network traffic and 3 captures of benign traffic from live Internet of Things (IoT) devices. Created by Sebastian Garcia, Agustin Parmisano, & Maria Jose Erquiaga at the Avast AIC laboratory with the funding of Avast Software, this dataset is one of the best in the field for Intrusion Detection Systems (IDS) for IoT devices [(Comparative Analysis of IoT Botnet Datasets)](https://doi.org/10.53070/bbd.1173687).
The selection of the subset was determined by [Aqeel Ahmed on Kaggle](https://www.kaggle.com/datasets/engraqeel/iot23preprocesseddata) and contains 6 million samples. Neither the Kaggle upload nor this one employs data balancing. The Kaggle card does not describe the criteria used to select these samples. To follow best practice, use this dataset to mock up processing the data into a model before using the full dataset with data balancing. Doing so will require processing the 8 GB of conn.log.labelled files.
This dataset only notes whether a sample is malicious or benign. The original dataset also labels the type of malicious traffic, so this processing of the dataset is only suited for binary classification.
# Feature information:
All features originate from the [Zeek](https://docs.zeek.org/en/master/scripts/base/protocols/conn/main.zeek.html#type-Conn::Info) processing performed by the dataset creators. [See these notes for caveats for each column](https://docs.zeek.org/en/master/scripts/base/protocols/conn/main.zeek.html#type-Conn::Info).
<details>
<summary>Expand for feature names, descriptions, and datatypes</summary>
- **id.orig_p**: The originator's port number. *(int64; uint64 in original)*
- **id.resp_p**: The responder's port number. *(int64; uint64 in original)*
- **proto**: The transport-layer protocol of the connection. *(string; enum of unknown_transport, tcp, udp, icmp; only TCP and UDP appear in this subset)*
- **service**: An identification of an application protocol being sent over the connection. *(optional string)*
- **duration**: How long the connection lasted. *(optional float64; a time interval)*
- **orig_bytes**: The number of payload bytes the originator sent. *(optional int64; uint64 in original)*
- **resp_bytes**: The number of payload bytes the responder sent. *(optional int64; uint64 in original)*
- **conn_state**: Value indicating the connection state (S0, S1, SF, REJ, S2, S3, RSTO, RSTR, RSTOS0, RSTRH, SH, SHR, OTH). *(optional string)*
- **missed_bytes**: The number of bytes missed in content gaps, which is representative of packet loss. *(optional int64; uint64 in original; default 0)*
- **history**: Records the state history of connections as a string of letters. *(optional string)*
- **orig_pkts**: Number of packets that the originator sent. *(optional int64; uint64 in original)*
- **orig_ip_bytes**: Number of IP-level bytes that the originator sent. *(optional int64; uint64 in original)*
- **resp_pkts**: Number of packets that the responder sent. *(optional int64; uint64 in original)*
- **resp_ip_bytes**: Number of IP-level bytes that the responder sent. *(optional int64; uint64 in original)*
- **label**: Specifies whether a data point is benign or some form of malicious. See the dataset creators' paper for descriptions of the attack types. *(string; enum of Malicious, Benign)*
NOTE: ts, uid, id.orig_h, and id.resp_h have been removed as they are dataset-specific. Models should not be trained with specific timestamps or IP addresses (id.orig_h) from this dataset, as that can lead to overfitting to dataset-specific times and addresses.
Further, local_orig and local_resp have been removed because they are null in all rows, making them useless for training.
</details>
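As a sketch of how the card's guidance might translate into preprocessing, the snippet below maps the string label to a binary target and collects the numeric features, filling optional (null) values with 0. The column names follow the feature list above; the sample row is a hypothetical illustration, not a real capture record.

```python
# Sketch: preparing one IoT-23 row for binary classification.
# The sample row below is a hypothetical illustration, not real data.

NUMERIC_FEATURES = [
    "id.orig_p", "id.resp_p", "duration", "orig_bytes", "resp_bytes",
    "missed_bytes", "orig_pkts", "orig_ip_bytes", "resp_pkts", "resp_ip_bytes",
]

def encode_row(row):
    """Return (features, target): numeric features with optional/null
    values filled with 0, and the label mapped to 1 (Malicious) / 0 (Benign)."""
    features = [row.get(name) if row.get(name) is not None else 0
                for name in NUMERIC_FEATURES]
    target = 1 if row["label"] == "Malicious" else 0
    return features, target

sample = {
    "id.orig_p": 443, "id.resp_p": 52113, "duration": None,
    "orig_bytes": 120, "resp_bytes": 0, "missed_bytes": 0,
    "orig_pkts": 2, "orig_ip_bytes": 200, "resp_pkts": 0,
    "resp_ip_bytes": 0, "label": "Malicious",
}
print(encode_row(sample))  # ([443, 52113, 0, 120, 0, 0, 2, 200, 0, 0], 1)
```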
## Citation
If you are using this dataset for your research, please reference it as “Sebastian Garcia, Agustin Parmisano, & Maria Jose Erquiaga. (2020). IoT-23: A labeled dataset with malicious and benign IoT network traffic (Version 1.0.0) [Data set]. Zenodo. http://doi.org/10.5281/zenodo.4743746” |
bellagio-ai/sample-uploaded-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 0
dataset_size: 2464
---
# Dataset Card for "sample-uploaded-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lollitor/FSONLYPOCKET | ---
dataset_info:
features:
- name: '#code'
dtype: string
- name: inputs
dtype: string
splits:
- name: train
num_bytes: 2061405
num_examples: 16245
download_size: 126658
dataset_size: 2061405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FSONLYPOCKET"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
midojiang/frist-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ADONIS
'1': AFRICAN GIANT SWALLOWTAIL
'2': AMERICAN SNOOT
splits:
- name: train
num_bytes: 8825732.0
num_examples: 338
download_size: 8823395
dataset_size: 8825732.0
---
# Dataset Card for "input-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlvdoorn/atco2-asr-atcosim | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: info
dtype: string
splits:
- name: train
num_bytes: 2029124649.948
num_examples: 8092
- name: validation
num_bytes: 508032748.446
num_examples: 2026
download_size: 2524947331
dataset_size: 2537157398.394
task_categories:
- automatic-speech-recognition
language:
- en
tags:
- air traffic control
- automatic speech recognition
- natural language processing
- atc
- asr
- nlp
- atco2
- atcosim
pretty_name: ATCO2-ASR-ATCOSIM
---
# Dataset Card for "atco2-asr-atcosim"
This is a dataset constructed from two datasets: [ATCO2-ASR](https://huggingface.co/datasets/jlvdoorn/atco2-asr) and [ATCOSIM](https://huggingface.co/datasets/jlvdoorn/atcosim).
It is divided into 80% train and 20% validation by selecting files randomly. Some of the files have additional information that is presented in the 'info' column. |
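The random file split described above can be sketched as follows; the seed and exact shuffling procedure are illustrative assumptions, not the authors' code.

```python
import random

def split_files(files, train_frac=0.8, seed=0):
    """Randomly split a list of files ~80/20 into train and validation,
    as the card describes. Seed and procedure are illustrative assumptions."""
    rng = random.Random(seed)
    shuffled = list(files)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

train, validation = split_files([f"clip_{i}.wav" for i in range(10)])
print(len(train), len(validation))  # 8 2
```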
pysentimiento/spanish-tweets-small | ---
dataset_info:
features:
- name: text
dtype: string
- name: tweet_id
dtype: string
- name: user_id
dtype: string
splits:
- name: train
num_bytes: 82649695458
num_examples: 597433111
- name: test
num_bytes: 892219251
num_examples: 6224733
download_size: 51737237106
dataset_size: 83541914709
---
# spanish-tweets-small
## A smaller version of spanish-tweets
## A corpus of tweets for pretraining embeddings and language models
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage**: https://github.com/pysentimiento/robertuito
- **Paper**: [RoBERTuito: a pre-trained language model for social media text in Spanish](https://aclanthology.org/2022.lrec-1.785/)
- **Point of Contact:** jmperez (at) dc.uba.ar
### Dataset Summary
A big dataset of (mostly) Spanish tweets for pre-training language models (or other representations).
### Supported Tasks and Leaderboards
Language Modeling
### Languages
Mostly Spanish, but some Portuguese, English, and other languages.
## Dataset Structure
### Data Fields
- *tweet_id*: id of the tweet
- *user_id*: id of the user
- *text*: text from the tweet
## Dataset Creation
The full process of data collection is described in the paper. Here we roughly outline the main points:
- A Spritzer collection uploaded to Archive.org dating from May 2019 was downloaded
- From this, we only kept tweets with language metadata equal to Spanish, and mark the users who posted these messages.
- Then, the tweet timeline of each of these marked users was downloaded.
This corpus consists of ~30M tweets.
Please note that we did not filter out tweets in other languages, so you might find English, Portuguese, Catalan, and other languages in the dataset (around 7–8% of the tweets are not in Spanish).
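Corpora like this are usually normalized before pretraining; a minimal sketch is below, replacing user mentions and URLs with placeholder tokens. The exact placeholder strings are illustrative assumptions, not necessarily those used for RoBERTuito.

```python
import re

def normalize_tweet(text):
    """Minimal tweet normalization sketch for LM pretraining:
    replace URLs and user mentions with placeholder tokens.
    The placeholder strings are illustrative assumptions."""
    text = re.sub(r"https?://\S+", "url", text)   # URLs first, so '@' in URLs is safe
    text = re.sub(r"@\w+", "@usuario", text)
    return text

print(normalize_tweet("hola @juan mira https://t.co/abc"))
# hola @usuario mira url
```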
### Citation Information
```
@inproceedings{perez-etal-2022-robertuito,
title = "{R}o{BERT}uito: a pre-trained language model for social media text in {S}panish",
author = "P{\'e}rez, Juan Manuel and
Furman, Dami{\'a}n Ariel and
Alonso Alemany, Laura and
Luque, Franco M.",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.785",
pages = "7235--7243",
abstract = "Since BERT appeared, Transformer language models and transfer learning have become state-of-the-art for natural language processing tasks. Recently, some works geared towards pre-training specially-crafted models for particular domains, such as scientific papers, medical documents, user-generated texts, among others. These domain-specific models have been shown to improve performance significantly in most tasks; however, for languages other than English, such models are not widely available. In this work, we present RoBERTuito, a pre-trained language model for user-generated text in Spanish, trained on over 500 million tweets. Experiments on a benchmark of tasks involving user-generated text showed that RoBERTuito outperformed other pre-trained language models in Spanish. In addition to this, our model has some cross-lingual abilities, achieving top results for English-Spanish tasks of the Linguistic Code-Switching Evaluation benchmark (LinCE) and also competitive performance against monolingual models in English Twitter tasks. To facilitate further research, we make RoBERTuito publicly available at the HuggingFace model hub together with the dataset used to pre-train it.",
}
``` |
lapix/CCAgT | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-nc-3.0
multilinguality:
- monolingual
paperswithcode_id: null
pretty_name: Images of Cervical Cells with AgNOR Stain Technique
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- image-segmentation
- object-detection
task_ids:
- semantic-segmentation
- instance-segmentation
---
# Dataset Card for Images of Cervical Cells with AgNOR Stain Technique
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [CCAgT homepage](https://data.mendeley.com/datasets/wg4bpm33hj/)
- **Repository:** [CCAgT-utils](https://github.com/johnnv1/CCAgT-utils)
- **Paper:** [Semantic Segmentation for the Detection of Very Small Objects on Cervical Cell Samples Stained with the AgNOR Technique](https://dx.doi.org/10.2139/ssrn.4126881)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [João G. A. Amorim](mailto:joao.atkinson@posgrad.ufsc.br)
### Dataset Summary
The CCAgT (Images of Cervical Cells with AgNOR Stain Technique) dataset contains 9339 images (1600x1200 resolution, where each pixel corresponds to 0.111µm × 0.111µm) from 15 different slides stained using the AgNOR technique. Each image has at least one label. In total, this dataset has more than 63K instances of annotated objects. The images are from patients of the Gynecology and Colposcopy Outpatient Clinic of the [Polydoro Ernani de São Thiago University Hospital of the Universidade Federal de Santa Catarina (HU-UFSC)](https://unihospital.ufsc.br/).
### Supported Tasks and Leaderboards
- `image-segmentation`: The dataset can be used to train a model for semantic segmentation or instance segmentation. Semantic segmentation consists of classifying each pixel of the image. Success on this task is typically measured by achieving high values of [mean IoU](https://huggingface.co/spaces/evaluate-metric/mean_iou) or [f-score](https://huggingface.co/spaces/evaluate-metric/f1) for pixel-level results. Instance segmentation consists of performing object detection first and then applying a semantic segmentation model inside the detected objects. For instance-level results, this task is typically measured by achieving high values of [recall](https://huggingface.co/spaces/evaluate-metric/recall), [precision](https://huggingface.co/spaces/evaluate-metric/precision) and [f-score](https://huggingface.co/spaces/evaluate-metric/f1).
- `object-detection`: The dataset can be used to train a model for object detection to detect the nuclei categories or the nucleolus organizer regions (NORs), which consists of locating instances of objects and then classifying each one. This task is typically measured by achieving high values of [recall](https://huggingface.co/spaces/evaluate-metric/recall), [precision](https://huggingface.co/spaces/evaluate-metric/precision) and [f-score](https://huggingface.co/spaces/evaluate-metric/f1).
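As a minimal illustration of the pixel-level metric mentioned above, the toy sketch below computes mean IoU over flat sequences of integer pixel labels; a real evaluation would use the linked `mean_iou` metric from the `evaluate` library.

```python
def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union over the classes that occur in
    pred or target, for flat sequences of integer pixel labels.
    A toy sketch; use the `evaluate` mean_iou metric in practice."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union:  # skip classes absent from both prediction and target
            ious.append(inter / union)
    return sum(ious) / len(ious)

print(mean_iou([0, 0, 1, 1], [0, 1, 1, 1], num_classes=2))
```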
### Languages
The class labels in the dataset are in English.
## Dataset Structure
### Data Instances
An example looks like the one below:
#### `semantic segmentation` (default configuration)
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=1200x1600 at 0x276021C5EB8>,
'annotation': <PIL.PngImagePlugin.PngImageFile image mode=L size=1200x1600 at 0x385021C5ED7>
}
```
#### `object detection`
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=1200x1600 at 0x276021C5EB8>,
'objects': {
'bbox': [
[36, 7, 13, 32],
[50, 7, 12, 32]
],
'label': [1, 5]
 }
}
```
#### `instance segmentation`
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=1200x1600 at 0x276021C5EB8>,
'objects': {
'bbox': [
[13.3, 7.5, 47.6, 38.3],
[10.2, 7.5, 50.7, 38.3]
],
'segment': [
[[36.2, 7.5, 13.3, 32.1, 52.1, 40.6, 60.9, 45.8, 50.1, 40, 40, 33.2, 35.2]],
[[10.2, 7.5, 10.3, 32.1, 52.1, 40.6, 60.9, 45.8, 50.1, 40, 40, 33.2, 35.2]],
],
'label': [1, 5]
 }
}
```
### Data Fields
The data annotations have the following fields:
#### `semantic segmentation` (default configuration)
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `annotation`: A `PIL.Image.Image` object containing the annotation mask. The mask has a single channel and the following pixel values are possible: `BACKGROUND` (0), `NUCLEUS` (1), `CLUSTER` (2), `SATELLITE` (3), `NUCLEUS_OUT_OF_FOCUS` (4), `OVERLAPPED_NUCLEI` (5), `NON_VIABLE_NUCLEUS` (6) and `LEUKOCYTE_NUCLEUS` (7).
#### `object detection`
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `objects`: a dictionary containing bounding boxes and labels of the cell objects
- `bbox`: a list of bounding boxes (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format) corresponding to the objects present on the image
- `label`: a list of integers representing the category (7 categories to describe the objects in total; two to differentiate nucleolus organizer regions), with the possible values including `NUCLEUS` (0), `CLUSTER` (1), `SATELLITE` (2), `NUCLEUS_OUT_OF_FOCUS` (3), `OVERLAPPED_NUCLEI` (4), `NON_VIABLE_NUCLEUS` (5) and `LEUKOCYTE_NUCLEUS` (6).
#### `instance segmentation`
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `objects`: a dictionary containing bounding boxes and labels of the cell objects
- `bbox`: a list of bounding boxes (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format) corresponding to the objects present on the image
- `segment`: a list of segments in format of `[polygon_0, ..., polygon_n]`, where each polygon is `[x0, y0, ..., xn, yn]`.
- `label`: a list of integers representing the category (7 categories to describe the objects in total; two to differentiate nucleolus organizer regions), with the possible values including `NUCLEUS` (0), `CLUSTER` (1), `SATELLITE` (2), `NUCLEUS_OUT_OF_FOCUS` (3), `OVERLAPPED_NUCLEI` (4), `NON_VIABLE_NUCLEUS` (5) and `LEUKOCYTE_NUCLEUS` (6).
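Since `bbox` uses the COCO `[x_min, y_min, width, height]` convention, a small helper for converting boxes to corner coordinates (as many plotting and augmentation tools expect) might look like this sketch:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format box [x_min, y_min, width, height]
    to corner format [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# Box values taken from the object-detection example above.
print(coco_to_corners([36, 7, 13, 32]))  # [36, 7, 49, 39]
```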
### Data Splits
The data is split randomly, using a fixed seed, into training, test, and validation sets. The training data contains 70% of the images, and the testing and validation data contain 15% of the images each. In total, the training set contains 6533 images, and the testing and validation sets contain 1403 images each.
<details>
<summary>
Click here to see additional statistics:
</summary>
| Slide id | Diagnostics | images | annotations | NUCLEUS | CLUSTER | SATELLITE | NUCLEUS_OUT_OF_FOCUS | OVERLAPPED_NUCLEI | NON_VIABLE_NUCLEUS | LEUKOCYTE_NUCLEUS |
| :-------: | :---------: | :----: | :---------: | :-----: | :------: | :-------: | :------------------: | :---------------: | :---------------: | :-------: |
| A | CIN 3 | 1311 | 3164 | 763 | 1038 | 922 | 381 | 46 | 14 | 0 |
| B | SCC | 561 | 911 | 224 | 307 | 112 | 132 | 5 | 1 | 130 |
| C | AC | 385 | 11420 | 2420 | 3584 | 1112 | 1692 | 228 | 477 | 1907 |
| D | CIN 3 | 2125 | 1258 | 233 | 337 | 107 | 149 | 12 | 8 | 412 |
| E | CIN 3 | 506 | 11131 | 2611 | 6249 | 1648 | 476 | 113 | 34 | 0 |
| F | CIN 1 | 318 | 3365 | 954 | 1406 | 204 | 354 | 51 | 326 | 70 |
| G | CIN 2 | 249 | 2759 | 691 | 1279 | 336 | 268 | 49 | 51 | 85 |
| H | CIN 2 | 650 | 5216 | 993 | 983 | 425 | 2562 | 38 | 214 | 1 |
| I | No lesion | 309 | 474 | 56 | 55 | 19 | 170 | 2 | 23 | 149 |
| J | CIN 1 | 261 | 1786 | 355 | 304 | 174 | 743 | 18 | 33 | 159 |
| K | No lesion | 1503 | 13102 | 2464 | 6669 | 638 | 620 | 670 | 138 | 1903 |
| L | CIN 2 | 396 | 3289 | 842 | 796 | 387 | 1209 | 27 | 23 | 5 |
| M | CIN 2 | 254 | 1500 | 357 | 752 | 99 | 245 | 16 | 12 | 19 |
| N | CIN 3 | 248 | 911 | 258 | 402 | 67 | 136 | 10 | 6 | 32 |
| O | AC | 262 | 2904 | 792 | 1549 | 228 | 133 | 88 | 52 | 62 |
| **Total** | - | 9339 | 63190 | 14013 | 25710 | 6478 | 9270 | 1373 | 1412 | 4934 |
Lesion types:
- Cervical intraepithelial neoplasia 1 - CIN 1
- Cervical intraepithelial neoplasia 2 - CIN 2
- Cervical intraepithelial neoplasia 3 - CIN 3
- Squamous cell carcinoma - SCC
- Adenocarcinoma - AC
- No lesion
</details>
## Dataset Creation
### Curation Rationale
CCAgT was built to provide a dataset for machines to learn how to identify nuclei and nucleolus organizer regions (NORs).
### Source Data
#### Initial Data Collection and Normalization
The images are collected as patches/tiles of whole slide images (WSIs) from cervical samples stained with AgNOR technique to allow the detection of nucleolus organizer regions (NORs). NORs are DNA loops containing genes responsible for the transcription of ribosomal RNA located in the cell nucleolus. They contain a set of argyrophilic proteins, selectively stained by silver nitrate, which can be identified as black dots located throughout the nucleoli area and called AgNORs.
#### Who are the source language producers?
The dataset was built using images from examinations (a gynecological exam, colposcopy and biopsy) of 15 women patients who were treated at the Gynecology and Colposcopy Outpatient Clinic of the [University Hospital Professor Polydoro Ernani de São Thiago of Federal University of Santa Catarina (HU-UFSC)](https://unihospital.ufsc.br/) and had 6 different diagnoses in their oncological exams. The samples were collected by the members of the Clinical Analyses Department: Ane Francyne Costa, Fabiana Botelho De Miranda Onofre, and Alexandre Sherlley Casimiro Onofre.
### Annotations
#### Annotation process
The instances were annotated using the [Labelbox](https://labelbox.com/) tool. The satellite category was labeled as a single dot, and the other categories were labeled as polygons. After the annotation process, all annotations were reviewed.
#### Who are the annotators?
Members of the Clinical Analyses Department and the Image Processing and Computer Graphics Lab. — LAPiX from [Universidade Federal de Santa Catarina (UFSC)](https://en.ufsc.br/).
- Tainee Bottamedi
- Vinícius Sanches
- João H. Telles de Carvalho
- Ricardo Thisted
### Personal and Sensitive Information
This research was approved by the UFSC Research Ethics Committee (CEPSH), protocol number 57423616.3.0000.0121. All involved patients were informed about the study's objectives, and those who agreed to participate signed an informed consent form.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset's purpose is to help spread the AgNOR technique as a support method for cancer diagnosis, since this method is not standardized among pathologists.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Satellite annotation is not as accurate for pixel-level representation due to single-point annotations.
## Additional Information
### Dataset Curators
Members of the Clinical Analyses Department from [Universidade Federal de Santa Catarina (UFSC)](https://en.ufsc.br/) collected the dataset samples: Ane Francyne Costa, Fabiana Botelho De Miranda Onofre, and Alexandre Sherlley Casimiro Onofre.
### Licensing Information
The files associated with this dataset are licensed under an [Attribution-NonCommercial 3.0 Unported](https://creativecommons.org/licenses/by-nc/3.0/) license.
Users are free to adapt, copy or redistribute the material as long as they attribute it appropriately and do not use it for commercial purposes.
### Citation Information
```bibtex
% Dataset official page
@misc{CCAgTDataset,
doi = {10.17632/WG4BPM33HJ.2},
url = {https://data.mendeley.com/datasets/wg4bpm33hj/2},
author = {Jo{\~{a}}o Gustavo Atkinson Amorim and Andr{\'{e}} Vict{\'{o}}ria Matias and Tainee Bottamedi and Vin{\'{i}}cius Sanches and Ane Francyne Costa and Fabiana Botelho De Miranda Onofre and Alexandre Sherlley Casimiro Onofre and Aldo von Wangenheim},
title = {CCAgT: Images of Cervical Cells with AgNOR Stain Technique},
publisher = {Mendeley},
year = {2022},
copyright = {Attribution-NonCommercial 3.0 Unported}
}
% Dataset second version
% pre-print:
@article{AtkinsonAmorim2022,
doi = {10.2139/ssrn.4126881},
url = {https://doi.org/10.2139/ssrn.4126881},
year = {2022},
publisher = {Elsevier {BV}},
author = {Jo{\~{a}}o Gustavo Atkinson Amorim and Andr{\'{e}} Vict{\'{o}}ria Matias and Allan Cerentini and Fabiana Botelho de Miranda Onofre and Alexandre Sherlley Casimiro Onofre and Aldo von Wangenheim},
title = {Semantic Segmentation for the Detection of Very Small Objects on Cervical Cell Samples Stained with the {AgNOR} Technique},
journal = {{SSRN} Electronic Journal}
}
% Dataset first version
% Link: https://arquivos.ufsc.br/d/373be2177a33426a9e6c/
% Paper:
@inproceedings{AtkinsonSegmentationAgNORCBMS2020,
author={Jo{\~{a}}o Gustavo Atkinson Amorim and Luiz Antonio Buschetto Macarini and Andr{\'{e}} Vict{\'{o}}ria Matias and Allan Cerentini and Fabiana Botelho De Miranda Onofre and Alexandre Sherlley Casimiro Onofre and Aldo von Wangenheim},
booktitle={2020 IEEE 33rd International Symposium on Computer-Based Medical Systems (CBMS)},
title={A Novel Approach on Segmentation of AgNOR-Stained Cytology Images Using Deep Learning},
year={2020},
pages={552-557},
doi={10.1109/CBMS49503.2020.00110},
url={https://doi.org/10.1109/CBMS49503.2020.00110}
}
```
### Contributions
Thanks to [@johnnv1](https://github.com/johnnv1) for adding this dataset. |
communityai/communityai_apt-instruct-code-micro-50k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 208257844.02012622
num_examples: 50000
download_size: 92527853
dataset_size: 208257844.02012622
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
abhishek/autotrain-data-cp4q-lpqu-u75r | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': daisy
'1': dandelion
'2': rose
'3': sunflower
'4': tulip
splits:
- name: train
num_bytes: 114410927.672
num_examples: 2196
- name: validation
num_bytes: 33682367.0
num_examples: 550
download_size: 166944003
dataset_size: 148093294.672
---
# Dataset Card for "autotrain-data-cp4q-lpqu-u75r"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andrewnoel/processed_light_dataset | ---
dataset_info:
features:
- name: scene
dtype: string
splits:
- name: train
num_bytes: 18625958.401499182
num_examples: 7684
- name: test
num_bytes: 2070089.5985008199
num_examples: 854
download_size: 11530553
dataset_size: 20696048.0
---
# Dataset Card for "processed_light_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nz0/lipe_model2 | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_sst2_bare_perfect | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 23489
num_examples: 160
- name: test
num_bytes: 56512
num_examples: 368
- name: train
num_bytes: 909578
num_examples: 8486
download_size: 576985
dataset_size: 989579
---
# Dataset Card for "MULTI_VALUE_sst2_bare_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kasvii/face-partuv2beautifulluv-ffhq8-samples | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
- name: control_image
dtype: image
splits:
- name: train
num_bytes: 5773523.0
num_examples: 8
download_size: 3581792
dataset_size: 5773523.0
---
# Dataset Card for "face-partuv2beautifulluv-ffhq8-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
income/cqadupstack-wordpress-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
  - 100K<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# CQADupstack-Wordpress: Top-20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the CQADupstack-Wordpress dataset of the BEIR benchmark.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in CQADupstack-Wordpress in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
Below is the old dataset card for the BEIR benchmark.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
# Minimal sketch of loading a preprocessed BEIR dataset with the `beir` package
# (here SciFact is used as an example; swap in any dataset URL from the table below)
from beir import util
from beir.datasets.data_loader import GenericDataLoader

url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip"
data_path = util.download_and_unzip(url, "datasets")
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")
```
### Supported Tasks and Leaderboards
The benchmark supports a zero-shot retrieval leaderboard in which models are evaluated with retrieval metrics such as nDCG@10 across the constituent datasets.
The current best performing models can be found in the [official leaderboard](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
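As an illustration, a qrels file in this format can be parsed into a nested dict (a minimal sketch; the file path is hypothetical):

```python
import csv
from collections import defaultdict

def load_qrels(path):
    """Parse a qrels TSV with columns query-id, corpus-id, score (first row is a header)."""
    qrels = defaultdict(dict)
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter="\t")
        next(reader)  # skip the header row
        for query_id, corpus_id, score in reader:
            qrels[query_id][corpus_id] = int(score)
    return dict(qrels)
```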
### Data Instances
A high-level example of a BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
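With `qrels` in this form, retrieval quality can be scored against a run of ranked results; below is a minimal sketch of recall@k (the `results` structure is an assumption, mirroring the qrels layout but with retrieval scores as values):

```python
def recall_at_k(qrels, results, k):
    """Average, over queries, of the fraction of relevant docs found in the top-k results."""
    per_query = []
    for qid, relevant in qrels.items():
        scores = results.get(qid, {})  # doc_id -> retrieval score for this query
        top_k = sorted(scores, key=scores.get, reverse=True)[:k]
        hits = sum(1 for doc_id in top_k if relevant.get(doc_id, 0) > 0)
        per_query.append(hits / len(relevant))
    return sum(per_query) / len(per_query)
```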
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
  - `query-id`: a `string` feature representing the unique query id
  - `corpus-id`: a `string` feature, denoting the document id.
  - `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
pszemraj/boulderspot | ---
license: apache-2.0
size_categories:
- 10K<n<100K
task_categories:
- image-classification
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': bouldering_area
'1': other
- name: stem
dtype: string
- name: suffix
dtype: string
splits:
- name: train
num_bytes: 2332064084.761
num_examples: 13679
- name: test
num_bytes: 805144625.12
num_examples: 3880
download_size: 3137306204
dataset_size: 3137208709.881
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
tags:
- bouldering
- rock climbing
- Boulderspot
---
# pszemraj/boulderspot
These are aerial images of Switzerland classified into what could be a bouldering area (label: `bouldering_area`) or not (label: `other`). The test set has no labels (i.e. the column is `None`) and is randomly sampled from across the country.
Sources:
- data: [SWISSIMAGE 10 cm](https://www.swisstopo.admin.ch/en/orthoimage-swissimage-10)
- labels: me
Date created: 2021
You can find some example CNN-based models trained on an earlier/smaller version of this dataset in [this repo](https://github.com/pszemraj/BoulderAreaDetector).
If you are a member of **an organization** interested in details of how this was created/similar ideation related to AI applications for the outdoors/climbing, feel free to contact me (info on my [site](https://pszemraj.carrd.co/)).
|
biglam/illustrated_ads | ---
annotations_creators:
- expert-generated
language: []
language_creators: []
license:
- cc0-1.0
multilinguality: []
pretty_name: 19th Century United States Newspaper Advert images with 'illustrated'
or 'non illustrated' labels
size_categories:
- n<1K
source_datasets: []
tags:
- lam
- historic newspapers
task_categories:
- image-classification
task_ids:
- multi-class-image-classification
---
The dataset contains images derived from the [Newspaper Navigator](https://news-navigator.labs.loc.gov/), a dataset of images drawn from the Library of Congress Chronicling America collection (chroniclingamerica.loc.gov/).
> [The Newspaper Navigator dataset](https://news-navigator.labs.loc.gov/) consists of extracted visual content for 16,358,041 historic newspaper pages in Chronicling America. The visual content was identified using an object detection model trained on annotations of World War 1-era Chronicling America pages, including annotations made by volunteers as part of the Beyond Words crowdsourcing project. source: https://news-navigator.labs.loc.gov/
One of these categories is 'advertisements'. This dataset contains a sample of these images with additional labels indicating if the advert is 'illustrated' or 'not illustrated'.
This dataset was created for use in a [Programming Historian tutorial](http://programminghistorian.github.io/ph-submissions/lessons/computer-vision-deep-learning-pt1). The primary aim of the data was to provide a realistic example dataset for teaching computer vision for working with digitised heritage material.
# Dataset Card for 19th Century United States Newspaper Advert images with 'illustrated' or 'non illustrated' labels
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**[https://doi.org/10.5281/zenodo.5838410](https://doi.org/10.5281/zenodo.5838410)
- **Paper:**[https://doi.org/10.46430/phen0101](https://doi.org/10.46430/phen0101)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset contains images derived from the [Newspaper Navigator](https://news-navigator.labs.loc.gov/), a dataset of images drawn from the Library of Congress Chronicling America collection (chroniclingamerica.loc.gov/).
> [The Newspaper Navigator dataset](https://news-navigator.labs.loc.gov/) consists of extracted visual content for 16,358,041 historic newspaper pages in Chronicling America. The visual content was identified using an object detection model trained on annotations of World War 1-era Chronicling America pages, including annotations made by volunteers as part of the Beyond Words crowdsourcing project. source: https://news-navigator.labs.loc.gov/
One of these categories is 'advertisements'. This dataset contains a sample of these images with additional labels indicating whether the advert is 'illustrated' or 'not illustrated'.
This dataset was created for use in a [Programming Historian tutorial](http://programminghistorian.github.io/ph-submissions/lessons/computer-vision-deep-learning-pt1). Its primary aim is to provide a realistic example dataset for teaching computer vision techniques on digitised heritage material.
### Supported Tasks and Leaderboards
- `image-classification`: the primary purpose of this dataset is for classifying historic newspaper images identified as being 'advertisements' into 'illustrated' and 'not-illustrated' categories.
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
An example instance from this dataset:
``` python
{'file': 'pst_fenske_ver02_data_sn84026497_00280776129_1880042101_0834_002_6_96.jpg',
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=L size=388x395 at 0x7F9A72038950>,
'label': 0,
'pub_date': Timestamp('1880-04-21 00:00:00'),
'page_seq_num': 834,
'edition_seq_num': 1,
'batch': 'pst_fenske_ver02',
'lccn': 'sn84026497',
'box': [0.649412214756012,
0.6045778393745422,
0.8002520799636841,
0.7152365446090698],
'score': 0.9609346985816956,
'ocr': "H. II. IIASLKT & SOXN, Dealers in General Merchandise In New Store Room nt HASLET'S COS ITERS, 'JTionoMtii, ln. .Tau'y 1st, 1?0.",
'place_of_publication': 'Tionesta, Pa.',
'geographic_coverage': "['Pennsylvania--Forest--Tionesta']",
'name': 'The Forest Republican. [volume]',
'publisher': 'Ed. W. Smiley',
'url': 'https://news-navigator.labs.loc.gov/data/pst_fenske_ver02/data/sn84026497/00280776129/1880042101/0834/002_6_96.jpg',
'page_url': 'https://chroniclingamerica.loc.gov/data/batches/pst_fenske_ver02/data/sn84026497/00280776129/1880042101/0834.jp2'}
```
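The `box` field stores the advert's location on the source newspaper page as normalised coordinates. Assuming the ordering is `[x0, y0, x1, y1]` expressed as fractions of the page's width and height (an assumption; check against the Newspaper Navigator documentation), converting back to pixel coordinates is straightforward:

```python
def box_to_pixels(box, page_width, page_height):
    """Convert a normalised [x0, y0, x1, y1] box to integer pixel coordinates.

    Assumes box values are fractions of the source page's width/height
    (hypothetical ordering; verify against the Newspaper Navigator docs).
    """
    x0, y0, x1, y1 = box
    return (round(x0 * page_width), round(y0 * page_height),
            round(x1 * page_width), round(y1 * page_height))

# Box from the example instance above, with hypothetical page dimensions.
box = [0.649412214756012, 0.6045778393745422,
       0.8002520799636841, 0.7152365446090698]
print(box_to_pixels(box, page_width=2570, page_height=3610))
```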
### Data Fields
[More Information Needed]
### Data Splits
The dataset contains a single split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
A description of the annotation process is outlined in this [GitHub repository](https://github.com/Living-with-machines/nnanno)
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
``` bibtex
@dataset{van_strien_daniel_2021_5838410,
author = {van Strien, Daniel},
title = {{19th Century United States Newspaper Advert images
with 'illustrated' or 'non illustrated' labels}},
month = oct,
year = 2021,
publisher = {Zenodo},
version = {0.0.1},
doi = {10.5281/zenodo.5838410},
url = {https://doi.org/10.5281/zenodo.5838410}}
```
### Contributions
Thanks to [@davanstrien](https://github.com/davanstrien) for adding this dataset.
|
CyberHarem/komano_aunn_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of komano_aunn/高麗野あうん/코마노아운 (Touhou)
This is the dataset of komano_aunn/高麗野あうん/코마노아운 (Touhou), containing 500 images and their tags.
The core tags of this character are `horns, single_horn, long_hair, green_hair, curly_hair, green_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 461.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komano_aunn_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 306.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komano_aunn_touhou/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 1027 | 596.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komano_aunn_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 500      | 423.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komano_aunn_touhou/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1027 | 776.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/komano_aunn_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
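The IMG+TXT packages pair each image with a same-named `.txt` file containing its tags. A minimal loader for an extracted archive might look like this (a sketch that assumes comma-separated tags, matching the tag lists shown in the cluster tables):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Yield (image_path, tag_list) pairs from an extracted IMG+TXT archive.

    Assumes every image has a sibling .txt file with comma-separated tags.
    """
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in ('.jpg', '.jpeg', '.png'):
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # skip images without a tag file
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        yield os.path.join(dataset_dir, name), tags
```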
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/komano_aunn_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, collared_shirt, looking_at_viewer, open_mouth, paw_pose, short_sleeves, solo, white_shorts, :d, buttons, cloud_print, fang, heart, very_long_hair |
| 1 | 10 |  |  |  |  |  | 1girl, cloud_print, collared_shirt, short_sleeves, simple_background, solo, white_background, white_shorts, looking_at_viewer, smile, paw_pose, open_mouth, blush, buttons |
| 2 | 7 |  |  |  |  |  | 1girl, blush, cloud_print, collared_shirt, open_mouth, short_sleeves, signature, smile, solo, white_background, white_shorts, full_body, simple_background, barefoot, fang |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | collared_shirt | looking_at_viewer | open_mouth | paw_pose | short_sleeves | solo | white_shorts | :d | buttons | cloud_print | fang | heart | very_long_hair | simple_background | white_background | smile | signature | full_body | barefoot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------------|:--------------------|:-------------|:-----------|:----------------|:-------|:---------------|:-----|:----------|:--------------|:-------|:--------|:-----------------|:--------------------|:-------------------|:--------|:------------|:------------|:-----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | | | | X | X | X | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | X | | | X | X | | | X | X | X | X | X | X |
|
stable-bias/prof_images_blip__SD_v2_random_seeds | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: paralegal
num_bytes: 7318486.0
num_examples: 210
- name: bartender
num_bytes: 9962460.0
num_examples: 210
- name: facilities_manager
num_bytes: 7289204.0
num_examples: 210
- name: accountant
num_bytes: 6909069.0
num_examples: 210
- name: graphic_designer
num_bytes: 7583565.0
num_examples: 210
- name: network_administrator
num_bytes: 7987215.0
num_examples: 210
- name: financial_manager
num_bytes: 6723858.0
num_examples: 210
- name: baker
num_bytes: 7612344.0
num_examples: 210
- name: security_guard
num_bytes: 7064225.0
num_examples: 210
- name: artist
num_bytes: 7371224.0
num_examples: 210
- name: author
num_bytes: 7756269.0
num_examples: 210
- name: printing_press_operator
num_bytes: 9471204.0
num_examples: 210
- name: public_relations_specialist
num_bytes: 6793885.0
num_examples: 210
- name: sheet_metal_worker
num_bytes: 8989830.0
num_examples: 210
- name: clergy
num_bytes: 6872330.0
num_examples: 210
- name: payroll_clerk
num_bytes: 7053041.0
num_examples: 210
- name: teller
num_bytes: 7069603.0
num_examples: 210
- name: real_estate_broker
num_bytes: 6834640.0
num_examples: 210
- name: customer_service_representative
num_bytes: 6559413.0
num_examples: 210
- name: painter
num_bytes: 7608853.0
num_examples: 210
- name: tractor_operator
num_bytes: 11327292.0
num_examples: 210
- name: dental_hygienist
num_bytes: 6442475.0
num_examples: 210
- name: industrial_engineer
num_bytes: 7953512.0
num_examples: 210
- name: electrician
num_bytes: 8211621.0
num_examples: 210
- name: head_cook
num_bytes: 6814586.0
num_examples: 210
- name: health_technician
num_bytes: 6619944.0
num_examples: 210
- name: carpet_installer
num_bytes: 9732036.0
num_examples: 210
- name: purchasing_agent
num_bytes: 7281241.0
num_examples: 210
- name: supervisor
num_bytes: 7259807.0
num_examples: 210
- name: civil_engineer
num_bytes: 7545036.0
num_examples: 210
- name: lawyer
num_bytes: 6932314.0
num_examples: 210
- name: language_pathologist
num_bytes: 8150292.0
num_examples: 210
- name: ceo
num_bytes: 6554129.0
num_examples: 210
- name: computer_support_specialist
num_bytes: 7234873.0
num_examples: 210
- name: postal_worker
num_bytes: 7301055.0
num_examples: 210
- name: mechanical_engineer
num_bytes: 8950764.0
num_examples: 210
- name: nursing_assistant
num_bytes: 6556593.0
num_examples: 210
- name: dentist
num_bytes: 6270843.0
num_examples: 210
- name: tutor
num_bytes: 7187052.0
num_examples: 210
- name: butcher
num_bytes: 9278949.0
num_examples: 210
- name: insurance_agent
num_bytes: 6681547.0
num_examples: 210
- name: courier
num_bytes: 7025670.0
num_examples: 210
- name: computer_programmer
num_bytes: 6942696.0
num_examples: 210
- name: truck_driver
num_bytes: 8172476.0
num_examples: 210
- name: mechanic
num_bytes: 8613675.0
num_examples: 210
- name: marketing_manager
num_bytes: 6926682.0
num_examples: 210
- name: sales_manager
num_bytes: 6745661.0
num_examples: 210
- name: correctional_officer
num_bytes: 6778508.0
num_examples: 210
- name: manager
num_bytes: 6888590.0
num_examples: 210
- name: underwriter
num_bytes: 6754765.0
num_examples: 210
- name: executive_assistant
num_bytes: 6952574.0
num_examples: 210
- name: designer
num_bytes: 7392282.0
num_examples: 210
- name: groundskeeper
num_bytes: 10560005.0
num_examples: 210
- name: mental_health_counselor
num_bytes: 7099182.0
num_examples: 210
- name: aerospace_engineer
num_bytes: 8135548.0
num_examples: 210
- name: taxi_driver
num_bytes: 8572478.0
num_examples: 210
- name: nurse
num_bytes: 5901924.0
num_examples: 210
- name: data_entry_keyer
num_bytes: 7313454.0
num_examples: 210
- name: musician
num_bytes: 7809608.0
num_examples: 210
- name: event_planner
num_bytes: 7802747.0
num_examples: 210
- name: writer
num_bytes: 7637301.0
num_examples: 210
- name: cook
num_bytes: 6985880.0
num_examples: 210
- name: welder
num_bytes: 9465455.0
num_examples: 210
- name: producer
num_bytes: 7228578.0
num_examples: 210
- name: hairdresser
num_bytes: 7603193.0
num_examples: 210
- name: farmer
num_bytes: 10706035.0
num_examples: 210
- name: construction_worker
num_bytes: 7380203.0
num_examples: 210
- name: air_conditioning_installer
num_bytes: 8662081.0
num_examples: 210
- name: electrical_engineer
num_bytes: 8480176.0
num_examples: 210
- name: occupational_therapist
num_bytes: 6649443.0
num_examples: 210
- name: career_counselor
num_bytes: 6763648.0
num_examples: 210
- name: interior_designer
num_bytes: 7636660.0
num_examples: 210
- name: jailer
num_bytes: 7590640.0
num_examples: 210
- name: office_clerk
num_bytes: 6884348.0
num_examples: 210
- name: market_research_analyst
num_bytes: 7437349.0
num_examples: 210
- name: laboratory_technician
num_bytes: 7008094.0
num_examples: 210
- name: social_assistant
num_bytes: 7170832.0
num_examples: 210
- name: medical_records_specialist
num_bytes: 7676823.0
num_examples: 210
- name: machinery_mechanic
num_bytes: 9304149.0
num_examples: 210
- name: police_officer
num_bytes: 7252930.0
num_examples: 210
- name: software_developer
num_bytes: 6701016.0
num_examples: 210
- name: clerk
num_bytes: 7695628.0
num_examples: 210
- name: salesperson
num_bytes: 7381322.0
num_examples: 210
- name: social_worker
num_bytes: 6872051.0
num_examples: 210
- name: director
num_bytes: 6816359.0
num_examples: 210
- name: fast_food_worker
num_bytes: 7514633.0
num_examples: 210
- name: singer
num_bytes: 7547454.0
num_examples: 210
- name: metal_worker
num_bytes: 9133547.0
num_examples: 210
- name: cleaner
num_bytes: 6968832.0
num_examples: 210
- name: computer_systems_analyst
num_bytes: 7765082.0
num_examples: 210
- name: dental_assistant
num_bytes: 6543175.0
num_examples: 210
- name: psychologist
num_bytes: 7111584.0
num_examples: 210
- name: machinist
num_bytes: 9150561.0
num_examples: 210
- name: therapist
num_bytes: 6625855.0
num_examples: 210
- name: veterinarian
num_bytes: 7112583.0
num_examples: 210
- name: teacher
num_bytes: 7225827.0
num_examples: 210
- name: architect
num_bytes: 7044691.0
num_examples: 210
- name: office_worker
num_bytes: 6827592.0
num_examples: 210
- name: drywall_installer
num_bytes: 6156113.0
num_examples: 210
- name: nutritionist
num_bytes: 8280362.0
num_examples: 210
- name: librarian
num_bytes: 9788648.0
num_examples: 210
- name: childcare_worker
num_bytes: 6785897.0
num_examples: 210
- name: school_bus_driver
num_bytes: 9425294.0
num_examples: 210
- name: file_clerk
num_bytes: 8158537.0
num_examples: 210
- name: logistician
num_bytes: 7505143.0
num_examples: 210
- name: scientist
num_bytes: 7256325.0
num_examples: 210
- name: teaching_assistant
num_bytes: 7336792.0
num_examples: 210
- name: radiologic_technician
num_bytes: 7086410.0
num_examples: 210
- name: manicurist
num_bytes: 6894697.0
num_examples: 210
- name: community_manager
num_bytes: 7589020.0
num_examples: 210
- name: carpenter
num_bytes: 8417470.0
num_examples: 210
- name: claims_appraiser
num_bytes: 7057174.0
num_examples: 210
- name: dispatcher
num_bytes: 7111905.0
num_examples: 210
- name: cashier
num_bytes: 8422908.0
num_examples: 210
- name: roofer
num_bytes: 8910783.0
num_examples: 210
- name: photographer
num_bytes: 7508323.0
num_examples: 210
- name: detective
num_bytes: 7606742.0
num_examples: 210
- name: financial_advisor
num_bytes: 6605338.0
num_examples: 210
- name: wholesale_buyer
num_bytes: 9320426.0
num_examples: 210
- name: it_specialist
num_bytes: 7201798.0
num_examples: 210
- name: pharmacy_technician
num_bytes: 8173939.0
num_examples: 210
- name: engineer
num_bytes: 7485900.0
num_examples: 210
- name: mover
num_bytes: 7409428.0
num_examples: 210
- name: plane_mechanic
num_bytes: 8697598.0
num_examples: 210
- name: interviewer
num_bytes: 6421369.0
num_examples: 210
- name: massage_therapist
num_bytes: 6439125.0
num_examples: 210
- name: dishwasher
num_bytes: 9661619.0
num_examples: 210
- name: fitness_instructor
num_bytes: 6832101.0
num_examples: 210
- name: credit_counselor
num_bytes: 6907573.0
num_examples: 210
- name: stocker
num_bytes: 9484149.0
num_examples: 210
- name: pharmacist
num_bytes: 8414409.0
num_examples: 210
- name: doctor
num_bytes: 6669475.0
num_examples: 210
- name: compliance_officer
num_bytes: 6578437.0
num_examples: 210
- name: aide
num_bytes: 6765586.0
num_examples: 210
- name: bus_driver
num_bytes: 8894973.0
num_examples: 210
- name: financial_analyst
num_bytes: 6659678.0
num_examples: 210
- name: receptionist
num_bytes: 6410167.0
num_examples: 210
- name: janitor
num_bytes: 7148774.0
num_examples: 210
- name: plumber
num_bytes: 7828285.0
num_examples: 210
- name: physical_therapist
num_bytes: 6675681.0
num_examples: 210
- name: inventory_clerk
num_bytes: 8559201.0
num_examples: 210
- name: firefighter
num_bytes: 8438408.0
num_examples: 210
- name: coach
num_bytes: 7342173.0
num_examples: 210
- name: maid
num_bytes: 6733909.0
num_examples: 210
- name: pilot
num_bytes: 7879490.0
num_examples: 210
- name: repair_worker
num_bytes: 7972885.0
num_examples: 210
download_size: 1160823534
dataset_size: 1107977251.0
---
# Dataset Card for "prof_images_blip__SD_v2_random_seeds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random_letter_find_passage_train10_eval40_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 5693
num_examples: 60
- name: validation
num_bytes: 5089
num_examples: 40
download_size: 9845
dataset_size: 10782
---
# Dataset Card for "random_letter_find_passage_train10_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dtrejopizzo/GSM8K-2023 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_alexredna__Tukan-1.1B-Chat-reasoning-sft-COLA | ---
pretty_name: Evaluation run of alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA](https://huggingface.co/alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alexredna__Tukan-1.1B-Chat-reasoning-sft-COLA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T23:29:11.286981](https://huggingface.co/datasets/open-llm-leaderboard/details_alexredna__Tukan-1.1B-Chat-reasoning-sft-COLA/blob/main/results_2024-01-22T23-29-11.286981.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2548927738380926,\n\
\ \"acc_stderr\": 0.03072430220180239,\n \"acc_norm\": 0.25609104148058187,\n\
\ \"acc_norm_stderr\": 0.03148388351888373,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3825474897236823,\n\
\ \"mc2_stderr\": 0.013853773787804245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30887372013651876,\n \"acc_stderr\": 0.013501770929344004,\n\
\ \"acc_norm\": 0.3412969283276451,\n \"acc_norm_stderr\": 0.01385583128749772\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4479187412865963,\n\
\ \"acc_stderr\": 0.004962638446395995,\n \"acc_norm\": 0.5977892850029874,\n\
\ \"acc_norm_stderr\": 0.004893418929918262\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118362,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118362\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388979,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388979\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215453,\n \"\
acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215453\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1935483870967742,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.1935483870967742,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.02874898368994106,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.02874898368994106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.02860620428922988,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.02860620428922988\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.03119584087770031,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.03119584087770031\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.026653531596715494,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.026653531596715494\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436777,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436777\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343574,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31223628691983124,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.31223628691983124,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742177,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742177\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.029343114798094476,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.029343114798094476\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27330779054916987,\n\
\ \"acc_stderr\": 0.01593668106262856,\n \"acc_norm\": 0.27330779054916987,\n\
\ \"acc_norm_stderr\": 0.01593668106262856\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.01428834380392531,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.01428834380392531\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888146,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.21631205673758866,\n \"acc_stderr\": 0.024561720560562803,\n \
\ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.024561720560562803\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.025767252010855966,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.025767252010855966\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.01784808957491323,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.01784808957491323\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.16326530612244897,\n \"acc_stderr\": 0.02366169917709862,\n\
\ \"acc_norm\": 0.16326530612244897,\n \"acc_norm_stderr\": 0.02366169917709862\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.3825474897236823,\n\
\ \"mc2_stderr\": 0.013853773787804245\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6077348066298343,\n \"acc_stderr\": 0.013722400462000885\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \
\ \"acc_stderr\": 0.003195747075480784\n }\n}\n```"
repo_url: https://huggingface.co/alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|arc:challenge|25_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|gsm8k|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hellaswag|10_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-29-11.286981.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T23-29-11.286981.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- '**/details_harness|winogrande|5_2024-01-22T23-29-11.286981.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T23-29-11.286981.parquet'
- config_name: results
data_files:
- split: 2024_01_22T23_29_11.286981
path:
- results_2024-01-22T23-29-11.286981.parquet
- split: latest
path:
- results_2024-01-22T23-29-11.286981.parquet
---
# Dataset Card for Evaluation run of alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA](https://huggingface.co/alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alexredna__Tukan-1.1B-Chat-reasoning-sft-COLA",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T23:29:11.286981](https://huggingface.co/datasets/open-llm-leaderboard/details_alexredna__Tukan-1.1B-Chat-reasoning-sft-COLA/blob/main/results_2024-01-22T23-29-11.286981.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2548927738380926,
"acc_stderr": 0.03072430220180239,
"acc_norm": 0.25609104148058187,
"acc_norm_stderr": 0.03148388351888373,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3825474897236823,
"mc2_stderr": 0.013853773787804245
},
"harness|arc:challenge|25": {
"acc": 0.30887372013651876,
"acc_stderr": 0.013501770929344004,
"acc_norm": 0.3412969283276451,
"acc_norm_stderr": 0.01385583128749772
},
"harness|hellaswag|10": {
"acc": 0.4479187412865963,
"acc_stderr": 0.004962638446395995,
"acc_norm": 0.5977892850029874,
"acc_norm_stderr": 0.004893418929918262
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118362,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118362
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388979,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388979
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215453,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215453
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1935483870967742,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.1935483870967742,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.02874898368994106,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.02874898368994106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.02860620428922988,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.02860620428922988
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.03119584087770031,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.03119584087770031
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.026653531596715494,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.026653531596715494
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436777,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436777
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343574,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31223628691983124,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.31223628691983124,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742177,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742177
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.029343114798094476,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.029343114798094476
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27330779054916987,
"acc_stderr": 0.01593668106262856,
"acc_norm": 0.27330779054916987,
"acc_norm_stderr": 0.01593668106262856
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.01428834380392531,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.01428834380392531
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.024561720560562803,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.024561720560562803
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.025767252010855966,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.025767252010855966
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.01784808957491323,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.01784808957491323
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.16326530612244897,
"acc_stderr": 0.02366169917709862,
"acc_norm": 0.16326530612244897,
"acc_norm_stderr": 0.02366169917709862
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.3825474897236823,
"mc2_stderr": 0.013853773787804245
},
"harness|winogrande|5": {
"acc": 0.6077348066298343,
"acc_stderr": 0.013722400462000885
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.003195747075480784
}
}
```
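The per-task scores above can also be aggregated programmatically. A minimal sketch over a small excerpt of the JSON (the three accuracies are copied from the report; the aggregation logic itself is illustrative, not the official leaderboard code):

```python
# Average the per-task accuracies for the MMLU (hendrycksTest) subjects.
# The excerpt holds three of the subject scores reported above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.17777777777777778},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.19078947368421054},
}

mmlu_tasks = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"{len(mmlu_tasks)} MMLU tasks, mean acc = {mean_acc:.4f}")
```

The full results file linked above can be parsed with `json.load` and aggregated the same way.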
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
freshpearYoon/train_free_36 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604560288
num_examples: 10000
download_size: 1215178299
dataset_size: 9604560288
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sanaeai/ar_tweets1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7864711
num_examples: 46452
download_size: 3336933
dataset_size: 7864711
---
# Dataset Card for "ar-tweets1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhmeltzer/ask-science-qg | ---
dataset_info:
features:
- name: title
dtype: string
- name: selftext
dtype: string
- name: answers.text
dtype: string
- name: answers.score
dtype: int64
- name: title_urls.url
sequence: string
- name: selftext_urls.url
sequence: string
- name: answers_urls.url
sequence: string
- name: sent_vec
sequence: float32
splits:
- name: train
num_bytes: 359548558
num_examples: 125323
- name: validation
num_bytes: 6458617
num_examples: 2060
- name: test
num_bytes: 12454869
num_examples: 4058
download_size: 390790124
dataset_size: 378462044
---
# Dataset Card for "ask-science-qg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/david-bowie | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/david-bowie"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.590408 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/2eea1354199a1914d947041259d25dc4.678x678x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/david-bowie">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">David Bowie</div>
<a href="https://genius.com/artists/david-bowie">
<div style="text-align: center; font-size: 14px;">@david-bowie</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/david-bowie).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/david-bowie")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|1302| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/david-bowie")

train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

texts = datasets["train"]["text"]
train_end = int(len(texts) * train_percentage)
validation_end = int(len(texts) * (train_percentage + validation_percentage))

train, validation, test = np.split(texts, [train_end, validation_end])

datasets = DatasetDict(
    {
        "train": Dataset.from_dict({"text": list(train)}),
        "validation": Dataset.from_dict({"text": list(validation)}),
        "test": Dataset.from_dict({"text": list(test)}),
    }
)
```
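As a quick sanity check (not part of the original card), the cut points implied by those percentages for the 1302 training examples can be verified with plain arithmetic:

```python
# Sizes of the train/validation/test splits implied by the percentages
# above for the 1302 examples in the 'train' split.
n = 1302
train_percentage = 0.9
validation_percentage = 0.07

train_end = int(n * train_percentage)                                 # first validation index
validation_end = int(n * (train_percentage + validation_percentage))  # first test index

sizes = (train_end, validation_end - train_end, n - validation_end)
print(sizes)  # (1171, 91, 40)
```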
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
emre/llama-2-instruct-121k-code | ---
dataset_info:
features:
- name: llamaV2Instruct
dtype: string
splits:
- name: train
num_bytes: 87634976
num_examples: 121959
download_size: 36997092
dataset_size: 87634976
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llamaV2Instruct-121k-code"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-glue-mnli-026a6e-14686020 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: nbhimte/tiny-bert-mnli-distilled
metrics: []
dataset_name: glue
dataset_config: mnli
dataset_split: validation_matched
col_mapping:
text1: premise
text2: hypothesis
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: nbhimte/tiny-bert-mnli-distilled
* Dataset: glue
* Config: mnli
* Split: validation_matched
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
argilla/llama-2-banking-fine-tune | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for llama-2-banking-fine-tune
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/llama-2-banking-fine-tune")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/llama-2-banking-fine-tune")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| request | User request | TextField | True | False |
| response-1 | Assistant response 1 | TextField | True | True |
| response-2 | Assistant response 2 | TextField | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| preference | Order response based on your preference | RankingQuestion | True | 1 = most accurate and helpful, 2 = worst. Ties are allowed. | ['response-1', 'response-2'] |
| correct-response | If both responses are inaccurate, complete the response: | TextQuestion | True | N/A | N/A |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names; they contain the value(s) of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"request": "I tried to make a transfer but it failed",
"response-1": " Thank you for choosing the Bank of America for your financial needs! I\u0027m so sorry to hear that your transfer failed. Can you please provide me with more details about the transfer, such as the amount and the account it was being transferred from/to? This will help me better understand the issue and assist you in resolving it as quickly as possible. Additionally, have you tried checking your account balance or recent transactions to see if the transfer has been processed? If you have any further questions or concerns, please don\u0027t hesitate to ask. Thank you for trusting the Bank of America with your financial matters!",
"response-2": " Hello there! Thank you for choosing the Bank of America for your financial needs. I\u0027m so sorry to hear that your transfer failed. Can you please provide me with more details about the transfer, such as the amount and the account it was being transferred from/to? Additionally, do you have any error messages or confirmations that you received after attempting the transfer? This information will help me better understand the issue and assist you in resolving it as quickly as possible. Thank you for trusting the Bank of America with your financial matters."
},
"id": null,
"metadata": {},
"responses": [],
"suggestions": [
{
"agent": null,
"question_id": "b80fb550-1add-4ad6-93c9-b403e6342306",
"question_name": "preference",
"score": null,
"type": null,
"value": [
{
"rank": 1,
"value": "response-2"
},
{
"rank": 2,
"value": "response-1"
}
]
}
]
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"correct-response": null,
"correct-response-suggestion": null,
"correct-response-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"metadata": null,
"preference": null,
"preference-suggestion": {
"rank": [
1,
2
],
"value": [
"response-2",
"response-1"
]
},
"preference-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"request": "I tried to make a transfer but it failed",
"response-1": " Thank you for choosing the Bank of America for your financial needs! I\u0027m so sorry to hear that your transfer failed. Can you please provide me with more details about the transfer, such as the amount and the account it was being transferred from/to? This will help me better understand the issue and assist you in resolving it as quickly as possible. Additionally, have you tried checking your account balance or recent transactions to see if the transfer has been processed? If you have any further questions or concerns, please don\u0027t hesitate to ask. Thank you for trusting the Bank of America with your financial matters!",
"response-2": " Hello there! Thank you for choosing the Bank of America for your financial needs. I\u0027m so sorry to hear that your transfer failed. Can you please provide me with more details about the transfer, such as the amount and the account it was being transferred from/to? Additionally, do you have any error messages or confirmations that you received after attempting the transfer? This information will help me better understand the issue and assist you in resolving it as quickly as possible. Thank you for trusting the Bank of America with your financial matters."
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the fields that will be used to provide responses to the questions.
* **request** is of type `TextField`.
* **response-1** is of type `TextField`.
* **response-2** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **preference** is of type `RankingQuestion` with the following allowed values ['response-1', 'response-2'], and description "1 = most accurate and helpful, 2 = worst. Ties are allowed.".
* (optional) **correct-response** is of type `TextQuestion`.
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, suggestions can be included to ease or assist annotators during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not only the suggestion itself but also the metadata linked to it, if applicable.
* (optional) **preference-suggestion** is of type `ranking` with the following allowed values ['response-1', 'response-2'].
* (optional) **correct-response-suggestion** is of type `text`.
Additionally, there is one more optional field:
* **external_id:** An identifier for the dataset record, useful for linking the record to an external resource, such as a database or a file.
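As a minimal sketch of how these fields can be consumed downstream (assuming the HuggingFace `datasets` record layout shown above, with a trimmed-down record literal standing in for an actual row), the rank-1 suggested response can be recovered like this:

```python
# Minimal sketch: pick the top-ranked response from a record in the
# HuggingFace `datasets` layout shown above. The record below is a
# trimmed-down stand-in for a real row, not actual dataset content.
record = {
    "request": "I tried to make a transfer but it failed",
    "response-1": "Thank you for choosing the Bank of America ...",
    "response-2": "Hello there! Thank you for choosing the Bank of America ...",
    "preference-suggestion": {
        "rank": [1, 2],
        "value": ["response-2", "response-1"],
    },
}

def top_ranked_response(rec):
    """Return the field name and text of the rank-1 suggested response."""
    suggestion = rec["preference-suggestion"]
    # Pair each rank with its field name, then sort so rank 1 comes first.
    ranked = sorted(zip(suggestion["rank"], suggestion["value"]))
    best_field = ranked[0][1]
    return best_field, rec[best_field]

field, text = top_ranked_response(record)
print(field)  # response-2
```

Note that `top_ranked_response` is a hypothetical helper written for illustration; it is not part of the Argilla or `datasets` APIs.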
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Please, read the question carefully and try to answer it as accurately as possible.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CVasNLPExperiments/Food101_test_google_flan_t5_small_mode_T_SPECIFIC_A_ns_10000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_descriptors_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 4198605
num_examples: 10000
download_size: 506659
dataset_size: 4198605
---
# Dataset Card for "Food101_test_google_flan_t5_small_mode_T_SPECIFIC_A_ns_10000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k | ---
pretty_name: Evaluation run of lmsys/vicuna-7b-v1.5-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lmsys/vicuna-7b-v1.5-16k](https://huggingface.co/lmsys/vicuna-7b-v1.5-16k) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T18:19:50.042065](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k/blob/main/results_2023-10-21T18-19-50.042065.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008913590604026845,\n\
\ \"em_stderr\": 0.0009625465757536675,\n \"f1\": 0.07239303691275177,\n\
\ \"f1_stderr\": 0.001693138759317829,\n \"acc\": 0.38701199696261246,\n\
\ \"acc_stderr\": 0.00973738194322199\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.008913590604026845,\n \"em_stderr\": 0.0009625465757536675,\n\
\ \"f1\": 0.07239303691275177,\n \"f1_stderr\": 0.001693138759317829\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06368460955269144,\n \
\ \"acc_stderr\": 0.006726213078805721\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638257\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lmsys/vicuna-7b-v1.5-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T11_48_14.833652
path:
- '**/details_harness|drop|3_2023-10-19T11-48-14.833652.parquet'
- split: 2023_10_21T18_19_50.042065
path:
- '**/details_harness|drop|3_2023-10-21T18-19-50.042065.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T18-19-50.042065.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T11_48_14.833652
path:
- '**/details_harness|gsm8k|5_2023-10-19T11-48-14.833652.parquet'
- split: 2023_10_21T18_19_50.042065
path:
- '**/details_harness|gsm8k|5_2023-10-21T18-19-50.042065.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T18-19-50.042065.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:58:23.659880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:58:23.659880.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T07:58:23.659880.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T11_48_14.833652
path:
- '**/details_harness|winogrande|5_2023-10-19T11-48-14.833652.parquet'
- split: 2023_10_21T18_19_50.042065
path:
- '**/details_harness|winogrande|5_2023-10-21T18-19-50.042065.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T18-19-50.042065.parquet'
- config_name: results
data_files:
- split: 2023_08_18T07_58_23.659880
path:
- results_2023-08-18T07:58:23.659880.parquet
- split: 2023_10_19T11_48_14.833652
path:
- results_2023-10-19T11-48-14.833652.parquet
- split: 2023_10_21T18_19_50.042065
path:
- results_2023-10-21T18-19-50.042065.parquet
- split: latest
path:
- results_2023-10-21T18-19-50.042065.parquet
---
# Dataset Card for Evaluation run of lmsys/vicuna-7b-v1.5-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lmsys/vicuna-7b-v1.5-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lmsys/vicuna-7b-v1.5-16k](https://huggingface.co/lmsys/vicuna-7b-v1.5-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k",
"harness_winogrande_5",
split="train")
```
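The timestamped split names listed in the configs above (e.g. `2023_10_21T18_19_50.042065`) encode the run timestamp with underscores. As a small illustration (pure standard library, not part of the official leaderboard tooling), they can be parsed back into `datetime` objects:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a run-split name like '2023_10_21T18_19_50.042065' into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run = parse_split_timestamp("2023_10_21T18_19_50.042065")
print(run.isoformat())  # 2023-10-21T18:19:50.042065
```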
## Latest results
These are the [latest results from run 2023-10-21T18:19:50.042065](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k/blob/main/results_2023-10-21T18-19-50.042065.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.008913590604026845,
"em_stderr": 0.0009625465757536675,
"f1": 0.07239303691275177,
"f1_stderr": 0.001693138759317829,
"acc": 0.38701199696261246,
"acc_stderr": 0.00973738194322199
},
"harness|drop|3": {
"em": 0.008913590604026845,
"em_stderr": 0.0009625465757536675,
"f1": 0.07239303691275177,
"f1_stderr": 0.001693138759317829
},
"harness|gsm8k|5": {
"acc": 0.06368460955269144,
"acc_stderr": 0.006726213078805721
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638257
}
}
```
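In the numbers above, the top-level `"all"` accuracy matches the arithmetic mean of the two per-task `acc` values (GSM8K and Winogrande); a quick sanity check on the reported figures:

```python
# Per-task accuracies from the "latest" results above.
gsm8k_acc = 0.06368460955269144
winogrande_acc = 0.7103393843725335

aggregate_acc = (gsm8k_acc + winogrande_acc) / 2
# Matches the reported "all" accuracy of 0.38701199696261246.
assert abs(aggregate_acc - 0.38701199696261246) < 1e-12
```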
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
quocanh34/data_for_synthesis_filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: intent
dtype: string
- name: sentence_annotation
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: file
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: origin_transcription
dtype: string
- name: sentence_norm
dtype: string
- name: w2v2_large_transcription
dtype: string
- name: wer
dtype: int64
splits:
- name: train
num_bytes: 859642543.031654
num_examples: 1660
download_size: 191939150
dataset_size: 859642543.031654
---
# Dataset Card for "data_for_synthesis_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ana85baul/autotrain-data-shapes | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: shapes
## Dataset Description
This dataset has been automatically processed by AutoTrain for project shapes.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<28x28 RGB PIL image>",
"target": 0
},
{
"image": "<28x28 RGB PIL image>",
"target": 2
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=3, names=['circles', 'squares', 'triangles'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 240 |
| valid | 60 |
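The integer `target` values map onto the `ClassLabel` names listed under "Dataset Fields". A minimal sketch of that mapping in plain Python (no `datasets` dependency assumed):

```python
# Class names in the order declared by the ClassLabel feature above.
names = ["circles", "squares", "triangles"]

def int2str(target: int) -> str:
    """Map an integer target to its class name, mirroring ClassLabel.int2str."""
    return names[target]

# The two samples shown above have targets 0 and 2.
print(int2str(0), int2str(2))  # circles triangles
```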
|
xzuyn/lima-alpaca | ---
language:
- en
size_categories:
- 1K<n<10K
---
[Original Dataset by Meta AI](https://huggingface.co/datasets/GAIR/lima)
[LIMA: Less Is More Alignment](https://arxiv.org/abs/2305.11206) |
tr416/dataset_20231006_234427 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 74235
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_234427"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001 | ---
pretty_name: Evaluation run of Lvxy1117/amber_fine_tune_001
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lvxy1117/amber_fine_tune_001](https://huggingface.co/Lvxy1117/amber_fine_tune_001)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T13:46:59.201897](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001/blob/main/results_2024-01-28T13-46-59.201897.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.31138101151746983,\n\
\ \"acc_stderr\": 0.0325934760300863,\n \"acc_norm\": 0.3139371329907083,\n\
\ \"acc_norm_stderr\": 0.033375512329911525,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394805,\n \"mc2\": 0.429338384075007,\n\
\ \"mc2_stderr\": 0.015517791037983605\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4121160409556314,\n \"acc_stderr\": 0.014383915302225398,\n\
\ \"acc_norm\": 0.44795221843003413,\n \"acc_norm_stderr\": 0.01453201149821167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5683130850428202,\n\
\ \"acc_stderr\": 0.0049429906231311166,\n \"acc_norm\": 0.7378012348137821,\n\
\ \"acc_norm_stderr\": 0.004389312748012154\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501117,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501117\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137282,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137282\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3660377358490566,\n \"acc_stderr\": 0.02964781353936525,\n\
\ \"acc_norm\": 0.3660377358490566,\n \"acc_norm_stderr\": 0.02964781353936525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.34104046242774566,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.34104046242774566,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.03793281185307812,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.03793281185307812\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138622,\n\
\ \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138622\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3064516129032258,\n\
\ \"acc_stderr\": 0.026226485652553887,\n \"acc_norm\": 0.3064516129032258,\n\
\ \"acc_norm_stderr\": 0.026226485652553887\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941065,\n\
\ \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941065\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232287,\n\
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232287\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593612,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593612\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.30275229357798167,\n \"acc_stderr\": 0.019698711434756353,\n \"\
acc_norm\": 0.30275229357798167,\n \"acc_norm_stderr\": 0.019698711434756353\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990417,\n \
\ \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990417\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459156,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459156\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3816793893129771,\n \"acc_stderr\": 0.0426073515764456,\n\
\ \"acc_norm\": 0.3816793893129771,\n \"acc_norm_stderr\": 0.0426073515764456\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.040261875275912025,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.040261875275912025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.38461538461538464,\n\
\ \"acc_stderr\": 0.03187195347942466,\n \"acc_norm\": 0.38461538461538464,\n\
\ \"acc_norm_stderr\": 0.03187195347942466\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.36015325670498083,\n\
\ \"acc_stderr\": 0.01716636247136928,\n \"acc_norm\": 0.36015325670498083,\n\
\ \"acc_norm_stderr\": 0.01716636247136928\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2346368715083799,\n\
\ \"acc_stderr\": 0.014173044098303653,\n \"acc_norm\": 0.2346368715083799,\n\
\ \"acc_norm_stderr\": 0.014173044098303653\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3366013071895425,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.3366013071895425,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.35691318327974275,\n\
\ \"acc_stderr\": 0.027210420375934033,\n \"acc_norm\": 0.35691318327974275,\n\
\ \"acc_norm_stderr\": 0.027210420375934033\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30864197530864196,\n \"acc_stderr\": 0.025702640260603753,\n\
\ \"acc_norm\": 0.30864197530864196,\n \"acc_norm_stderr\": 0.025702640260603753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590627,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590627\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28748370273794005,\n\
\ \"acc_stderr\": 0.0115593373557085,\n \"acc_norm\": 0.28748370273794005,\n\
\ \"acc_norm_stderr\": 0.0115593373557085\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3104575163398693,\n \"acc_stderr\": 0.018718067052623227,\n \
\ \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.018718067052623227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.028920583220675592,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.028920583220675592\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.38011695906432746,\n \"acc_stderr\": 0.037229657413855394,\n\
\ \"acc_norm\": 0.38011695906432746,\n \"acc_norm_stderr\": 0.037229657413855394\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.015846315101394805,\n \"mc2\": 0.429338384075007,\n\
\ \"mc2_stderr\": 0.015517791037983605\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120234\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03639120545868082,\n \
\ \"acc_stderr\": 0.005158113489231189\n }\n}\n```"
repo_url: https://huggingface.co/Lvxy1117/amber_fine_tune_001
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|arc:challenge|25_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|gsm8k|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hellaswag|10_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T13-46-59.201897.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- '**/details_harness|winogrande|5_2024-01-28T13-46-59.201897.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T13-46-59.201897.parquet'
- config_name: results
data_files:
- split: 2024_01_28T13_46_59.201897
path:
- results_2024-01-28T13-46-59.201897.parquet
- split: latest
path:
- results_2024-01-28T13-46-59.201897.parquet
---
# Dataset Card for Evaluation run of Lvxy1117/amber_fine_tune_001
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Lvxy1117/amber_fine_tune_001](https://huggingface.co/Lvxy1117/amber_fine_tune_001) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001",
"harness_winogrande_5",
	split="latest")
```
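The repository id used above follows a simple naming convention: judging from the names shown in this card, the leaderboard derives the details repo from the evaluated model id by replacing `/` with `__` and prefixing `open-llm-leaderboard/details_`. A small helper sketching that convention (the function name is our own, not part of any library):

```python
def leaderboard_details_repo(model_id: str) -> str:
    """Build the Open LLM Leaderboard details-repo id for a given model id.

    Convention inferred from this card: "Lvxy1117/amber_fine_tune_001"
    maps to "open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001".
    """
    # Replace the org/model separator with "__" and add the details prefix.
    return "open-llm-leaderboard/details_" + model_id.replace("/", "__")


print(leaderboard_details_repo("Lvxy1117/amber_fine_tune_001"))
# → open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001
```

This is convenient when scripting over several evaluated models: pass the returned id as the first argument to `load_dataset`, together with a task configuration name such as `"harness_winogrande_5"`.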
## Latest results
These are the [latest results from run 2024-01-28T13:46:59.201897](https://huggingface.co/datasets/open-llm-leaderboard/details_Lvxy1117__amber_fine_tune_001/blob/main/results_2024-01-28T13-46-59.201897.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.31138101151746983,
"acc_stderr": 0.0325934760300863,
"acc_norm": 0.3139371329907083,
"acc_norm_stderr": 0.033375512329911525,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394805,
"mc2": 0.429338384075007,
"mc2_stderr": 0.015517791037983605
},
"harness|arc:challenge|25": {
"acc": 0.4121160409556314,
"acc_stderr": 0.014383915302225398,
"acc_norm": 0.44795221843003413,
"acc_norm_stderr": 0.01453201149821167
},
"harness|hellaswag|10": {
"acc": 0.5683130850428202,
"acc_stderr": 0.0049429906231311166,
"acc_norm": 0.7378012348137821,
"acc_norm_stderr": 0.004389312748012154
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501117,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501117
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137282,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137282
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3660377358490566,
"acc_stderr": 0.02964781353936525,
"acc_norm": 0.3660377358490566,
"acc_norm_stderr": 0.02964781353936525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.34104046242774566,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.34104046242774566,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.03793281185307812,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.03793281185307812
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138622,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138622
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3064516129032258,
"acc_stderr": 0.026226485652553887,
"acc_norm": 0.3064516129032258,
"acc_norm_stderr": 0.026226485652553887
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.028748983689941065,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.028748983689941065
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232287,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593612,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30275229357798167,
"acc_stderr": 0.019698711434756353,
"acc_norm": 0.30275229357798167,
"acc_norm_stderr": 0.019698711434756353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.030381931949990417,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.030381931949990417
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459156,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459156
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3816793893129771,
"acc_stderr": 0.0426073515764456,
"acc_norm": 0.3816793893129771,
"acc_norm_stderr": 0.0426073515764456
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.040261875275912025,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.040261875275912025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.03187195347942466,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.03187195347942466
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.36015325670498083,
"acc_stderr": 0.01716636247136928,
"acc_norm": 0.36015325670498083,
"acc_norm_stderr": 0.01716636247136928
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2346368715083799,
"acc_stderr": 0.014173044098303653,
"acc_norm": 0.2346368715083799,
"acc_norm_stderr": 0.014173044098303653
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3366013071895425,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.3366013071895425,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.35691318327974275,
"acc_stderr": 0.027210420375934033,
"acc_norm": 0.35691318327974275,
"acc_norm_stderr": 0.027210420375934033
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30864197530864196,
"acc_stderr": 0.025702640260603753,
"acc_norm": 0.30864197530864196,
"acc_norm_stderr": 0.025702640260603753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590627,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590627
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28748370273794005,
"acc_stderr": 0.0115593373557085,
"acc_norm": 0.28748370273794005,
"acc_norm_stderr": 0.0115593373557085
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3125,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3104575163398693,
"acc_stderr": 0.018718067052623227,
"acc_norm": 0.3104575163398693,
"acc_norm_stderr": 0.018718067052623227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.028920583220675592,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.028920583220675592
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.38011695906432746,
"acc_stderr": 0.037229657413855394,
"acc_norm": 0.38011695906432746,
"acc_norm_stderr": 0.037229657413855394
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394805,
"mc2": 0.429338384075007,
"mc2_stderr": 0.015517791037983605
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120234
},
"harness|gsm8k|5": {
"acc": 0.03639120545868082,
"acc_stderr": 0.005158113489231189
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
clyu/sg_55k_cleaned_en_split8k | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
dataset_info:
features:
- name: prompt_id
dtype: string
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train_sft
num_bytes: 683175419.6903844
num_examples: 59881
- name: test_sft
num_bytes: 35960804.3096156
num_examples: 3152
download_size: 310298835
dataset_size: 719136224.0
---
# Dataset Card for "sg_55k_cleaned_en_split8k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arcee-ai/synthetic-data-gen | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Title
dtype: string
- name: Abstract
dtype: string
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 633145356
num_examples: 798682
- name: test
num_bytes: 158654392
num_examples: 200278
download_size: 398488431
dataset_size: 791799748
---
# Dataset Card for "synthetic-data-gen"
This is the synthetically generated dataset used for preliminary research results from [Arcee's](https://www.arcee.ai/) open-source [DALM](https://github.com/arcee-ai/DALM/) repo, which implements end-to-end (E2E) RAG fine-tuning over a generator and retriever with cross-gradient propagation.
Implementation research for E2E RAG:
* TACL paper - https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00530/114590/Improving-the-Domain-Adaptation-of-Retrieval
* Previous code - https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/README.md |
clarin-knext/fiqa-pl | ---
language:
- pl
---
Part of **BEIR-PL: Zero Shot Information Retrieval Benchmark for the Polish Language**.
Link to arxiv: https://arxiv.org/pdf/2305.19840.pdf
Contact: konrad.wojtasik@pwr.edu.pl |
senhorsapo/rad | ---
license: openrail
---
|
nccratliri/vad-zebra-finch | ---
license: apache-2.0
---
# Positive Transfer Of The Whisper Speech Transformer To Human And Animal Voice Activity Detection
We proposed WhisperSeg, utilizing the Whisper Transformer pre-trained for Automatic Speech Recognition (ASR) for both human and animal Voice Activity Detection (VAD). For more details, please refer to our paper
>
> [**Positive Transfer of the Whisper Speech Transformer to Human and Animal Voice Activity Detection**](https://doi.org/10.1101/2023.09.30.560270)
>
> Nianlong Gu, Kanghwi Lee, Maris Basha, Sumit Kumar Ram, Guanghao You, Richard H. R. Hahnloser <br>
> University of Zurich and ETH Zurich
This is the Zebra finch dataset customized for Animal Voice Activity Detection (vocal segmentation) in WhisperSeg.
## Download Dataset
```python
from huggingface_hub import snapshot_download
snapshot_download('nccratliri/vad-zebra-finch', local_dir = "data/zebra-finch", repo_type="dataset" )
```
For more usage details, please refer to the GitHub repository: https://github.com/nianlonggu/WhisperSeg
When using this dataset, please also cite:
```
@article {Tomka2023.09.04.555475,
author = {Tomas Tomka and Xinyu Hao and Aoxue Miao and Kanghwi Lee and Maris Basha and Stefan Reimann and Anja T Zai and Richard Hahnloser},
title = {Benchmarking nearest neighbor retrieval of zebra finch vocalizations across development},
elocation-id = {2023.09.04.555475},
year = {2023},
doi = {10.1101/2023.09.04.555475},
publisher = {Cold Spring Harbor Laboratory},
abstract = {Vocalizations are highly specialized motor gestures that regulate social interactions. The reliable detection of vocalizations from raw streams of microphone data remains an open problem even in research on widely studied animals such as the zebra finch. A promising method for finding vocal samples from potentially few labelled examples(templates) is nearest neighbor retrieval, but this method has never been extensively tested on vocal segmentation tasks. We retrieve zebra finch vocalizations as neighbors of each other in the sound spectrogram space. Based on merely 50 templates, we find excellent retrieval performance in adults (F1 score of 0.93 +/- 0.07) but not in juveniles (F1 score of 0.64 +/- 0.18), presumably due to the larger vocal variability of the latter. The performance in juveniles improves when retrieval is based on fixed-size template slices (F1 score of 0.72 +/- 0.10) instead of entire templates. Among the several distance metrics we tested such as the cosine and the Euclidean distance, we find that the Spearman distance largely outperforms all others. We release our expert-curated dataset of more than 50{\textquoteright}000 zebra finch vocal segments, which will enable training of data-hungry machine-learning approaches.Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2023/09/04/2023.09.04.555475},
eprint = {https://www.biorxiv.org/content/early/2023/09/04/2023.09.04.555475.full.pdf},
journal = {bioRxiv}
}
```
```
@article {Gu2023.09.30.560270,
author = {Nianlong Gu and Kanghwi Lee and Maris Basha and Sumit Kumar Ram and Guanghao You and Richard Hahnloser},
title = {Positive Transfer of the Whisper Speech Transformer to Human and Animal Voice Activity Detection},
elocation-id = {2023.09.30.560270},
year = {2023},
doi = {10.1101/2023.09.30.560270},
publisher = {Cold Spring Harbor Laboratory},
abstract = {This paper introduces WhisperSeg, utilizing the Whisper Transformer pre-trained for Automatic Speech Recognition (ASR) for human and animal Voice Activity Detection (VAD). Contrary to traditional methods that detect human voice or animal vocalizations from a short audio frame and rely on careful threshold selection, WhisperSeg processes entire spectrograms of long audio and generates plain text representations of onset, offset, and type of voice activity. Processing a longer audio context with a larger network greatly improves detection accuracy from few labeled examples. We further demonstrate a positive transfer of detection performance to new animal species, making our approach viable in the data-scarce multi-species setting.Competing Interest StatementThe authors have declared no competing interest.},
URL = {https://www.biorxiv.org/content/early/2023/10/02/2023.09.30.560270},
eprint = {https://www.biorxiv.org/content/early/2023/10/02/2023.09.30.560270.full.pdf},
journal = {bioRxiv}
}
```
## Contact
nianlong.gu@uzh.ch |
tyzhu/random_letter_same_length_find_passage_train10_eval40_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 22335
num_examples: 60
- name: validation
num_bytes: 16031
num_examples: 40
download_size: 32228
dataset_size: 38366
---
# Dataset Card for "random_letter_same_length_find_passage_train10_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cubpaw/voxelgym_demo | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: rgb_label
dtype: image
splits:
- name: train
num_bytes: 70717.0
num_examples: 40
- name: validation
num_bytes: 17077.0
num_examples: 10
download_size: 79483
dataset_size: 87794.0
---
# Dataset Card for "voxelgym_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_my_i | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2275
num_examples: 29
- name: test
num_bytes: 1996
num_examples: 24
- name: train
num_bytes: 12732
num_examples: 164
download_size: 14310
dataset_size: 17003
---
# Dataset Card for "MULTI_VALUE_cola_my_i"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blanchon/OSCD_MSI | ---
language: en
license: unknown
task_categories:
- change-detection
paperswithcode_id: oscd
pretty_name: OSCD MSI
tags:
- remote-sensing
- earth-observation
- geospatial
- satellite-imagery
- change-detection
- sentinel-2
dataset_info:
features:
- name: image1
dtype:
array3_d:
dtype: uint16
shape:
- 13
- 10000
- 10000
- name: image2
dtype:
array3_d:
dtype: uint16
shape:
- 13
- 10000
- 10000
- name: mask
dtype: image
splits:
- name: train
num_bytes: 365977532.0
num_examples: 14
- name: test
num_bytes: 172930908.0
num_examples: 10
download_size: 397786113
dataset_size: 538908440.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# OSCD MSI
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
The [Onera Satellite Change Detection](https://rcdaudt.github.io/oscd/) (OSCD) dataset is a change detection dataset that focuses on multispectral (MS) images taken by the [ESA Sentinel-2 satellite](https://sentinel.esa.int/web/sentinel/missions/sentinel-2). This dataset provides 24 registered image pairs from various continents, captured between the years 2015 and 2018. Along with the image pairs, the dataset includes binary change masks that highlight areas where changes have occurred. Pixel-level change ground truth is provided for all 14 training and 10 test image pairs. The annotated changes focus on urban changes, such as new buildings or new roads. These data can be used for training and setting parameters of change detection algorithms.
- **Paper:** https://arxiv.org/abs/1810.08468
- **Homepage:** https://github.com/phelber/OSCD
## Description
<!-- Provide a longer summary of what this dataset is. -->
The dataset is available in two versions: RGB only and all **13 MSI Bands** (this repo) [Multispectral (MS) Sentinel-2 bands](https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-2-msi/resolutions/spatial).
- **Total Number of Images**: 24
- **Bands**: 13 (MSI)
- **Image Size**: 10000x10000
- **Image Resolution**: 10m
- **Land Cover Classes**: 2
- **Classes**: no change, change
- **Source**: Sentinel-2
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/OSCD_MSI")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
OSCD_MSI = load_dataset("blanchon/OSCD_MSI")
```
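Each record pairs two 13-band `uint16` arrays with a binary change mask. As a rough illustration of what change detection over such pairs looks like, here is a minimal sketch using tiny synthetic arrays in place of the real (13, 10000, 10000) images (the fixed threshold is purely illustrative, not the benchmark method):

```python
import numpy as np

# Synthetic stand-ins for one image pair (real images are 13 x 10000 x 10000).
img1 = np.zeros((13, 4, 4), dtype=np.uint16)
img2 = np.zeros((13, 4, 4), dtype=np.uint16)
img2[:, 1, 1] = 100  # one pixel changes across all 13 bands

# Per-pixel spectral change magnitude between the two acquisitions.
diff = img2.astype(np.float32) - img1.astype(np.float32)
magnitude = np.linalg.norm(diff, axis=0)  # shape (4, 4)

# A crude fixed threshold yields a binary change mask.
change_mask = magnitude > 50.0
print(int(change_mask.sum()))  # 1
```

Real change-detection models learn this decision boundary instead of thresholding, but the input/output shapes are the same: two `(13, H, W)` images in, one `(H, W)` binary mask out.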
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the OSCD dataset in your research, please consider citing the following publication:
```bibtex
@article{daudt2018urban,
title = {Urban Change Detection for Multispectral Earth Observation Using Convolutional Neural Networks},
author = {R. C. Daudt and B. L. Saux and Alexandre Boulch and Y. Gousseau},
journal = {IEEE International Geoscience and Remote Sensing Symposium},
year = {2018},
doi = {10.1109/IGARSS.2018.8518015},
bibSource = {Semantic Scholar https://www.semanticscholar.org/paper/18755de13d2d868255bbf332fa80ec218163c74e}
}
```
|
fairlabs/fairlabs-esg-sentiment-data-binary-raw | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 12419002.40555464
num_examples: 62224
- name: validation
num_bytes: 3104950.186796309
num_examples: 15557
- name: test
num_bytes: 199585.40764905245
num_examples: 1000
download_size: 8556782
dataset_size: 15723538.000000002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
kgr123/quality_counter_1024_4_uniq | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 555190768
num_examples: 20000
- name: validation
num_bytes: 222148298
num_examples: 8000
- name: test
num_bytes: 56238906
num_examples: 2300
download_size: 26486684
dataset_size: 833577972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
VatsaDev/SQUAD-Databricks | ---
license: apache-2.0
---
|
CyberHarem/stella_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of stella/スーテラ (Granblue Fantasy)
This is the dataset of stella/スーテラ (Granblue Fantasy), containing 55 images and their tags.
The core tags of this character are `animal_ears, short_hair, breasts, hair_ornament, mole, mole_under_mouth, grey_hair, bangs, medium_breasts, brown_hair, grey_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 55 | 49.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 55 | 35.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 114 | 65.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 55 | 45.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 114 | 81.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stella_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/stella_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, detached_sleeves, erune, looking_at_viewer, solo, arrow_(projectile), bow_(weapon), cape, holding_weapon, black_thighhighs, full_body, fur_trim, thigh_boots, ass, black_footwear, high_heel_boots, scarf, short_dress, zettai_ryouiki, backless_outfit, bare_shoulders, black_dress, from_behind, gloves, looking_back, quiver, smile, standing, white_background |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, erune, looking_at_viewer, solo, collarbone, long_sleeves, choker, shorts, smile, bag, belt, boots, brown_eyes, cleavage, closed_mouth, detached_sleeves, frills, full_body, halterneck, hat, off_shoulder, simple_background, socks, standing, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | detached_sleeves | erune | looking_at_viewer | solo | arrow_(projectile) | bow_(weapon) | cape | holding_weapon | black_thighhighs | full_body | fur_trim | thigh_boots | ass | black_footwear | high_heel_boots | scarf | short_dress | zettai_ryouiki | backless_outfit | bare_shoulders | black_dress | from_behind | gloves | looking_back | quiver | smile | standing | white_background | collarbone | long_sleeves | choker | shorts | bag | belt | boots | brown_eyes | cleavage | closed_mouth | frills | halterneck | hat | off_shoulder | simple_background | socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:--------|:--------------------|:-------|:---------------------|:---------------|:-------|:-----------------|:-------------------|:------------|:-----------|:--------------|:------|:-----------------|:------------------|:--------|:--------------|:-----------------|:------------------|:-----------------|:--------------|:--------------|:---------|:---------------|:---------|:--------|:-----------|:-------------------|:-------------|:---------------|:---------|:---------|:------|:-------|:--------|:-------------|:-----------|:---------------|:---------|:-------------|:------|:---------------|:--------------------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
gardner/tinyllama-function-calling-eval | ---
dataset_info:
features:
- name: prompt
dtype: string
id: field
- name: response1
dtype: string
id: field
- name: response2
dtype: string
id: field
splits:
- name: train
num_bytes: 2427817
num_examples: 1000
download_size: 949390
dataset_size: 2427817
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- en
---
# Not intended for training
This dataset is the result of an evaluation run on the model located here: [gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T](https://huggingface.co/gardner/TinyLlama-1.1B-SlimOrca-Function-Calling-3T)
# Format
In this result set, `response1` comes from the fine-tuned model and `response2` comes from the test dataset. |
yyamada/SpatialEvalLLM | ---
license: mit
---
# SpatialEvalLLM Dataset
## Overview
The SpatialEvalLLM dataset is a collection of prompts containing natural language descriptions of different geometries: square grids (with a rhombus-grid variant), rectangular grids, hexagonal grids, triangular grids, tree structures, and ring structures, each in several sizes. The dataset is designed to evaluate how well language models navigate these spatial structures. Each prompt specifies a starting point within the structure, directions for movement, and the number of steps to take. The dataset facilitates benchmarking language models' ability to understand and navigate spatial configurations.
## Contents
The dataset contains the following components:
### Folders:
- `map_global`: prompts in which every prompt starts with a description of the entire structure.
- `map_local`: prompts in which only partial information about the structure is given.
### File Naming Convention:
All files follow the format `type-[type]_size-[size]_steps-[steps]_seed-[seed]_n-[n]`:
- `type`: type of structure described in the prompts of the file.
- `size`: size of the structure.
- `steps`: number of navigation steps.
- `seed`: random seed number.
- `n`: number of prompts in the file.
### Prompt Structure:
Every prompt has two keys, `"question"` and `"answer"`:
- `"question"`: the prompt itself.
- `"answer"`: the ground truth for the prompt.
For example, the file `type-ring_size-12_steps-8_seed-12_n-100` contains 100 prompts over a ring structure with 12 nodes, each asking for 8 navigation steps, generated with random seed 12.
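The naming convention can be unpacked mechanically when iterating over the downloaded files. A minimal sketch (the regex and helper below are our own, not shipped with the dataset):

```python
import re

# Parses filenames of the form
# type-[type]_size-[size]_steps-[steps]_seed-[seed]_n-[n]
FILENAME_RE = re.compile(
    r"type-(?P<type>.+)_size-(?P<size>\d+)_steps-(?P<steps>\d+)"
    r"_seed-(?P<seed>\d+)_n-(?P<n>\d+)"
)

def parse_filename(name: str) -> dict:
    match = FILENAME_RE.fullmatch(name)
    if match is None:
        raise ValueError(f"unrecognized filename: {name}")
    fields = match.groupdict()
    # Every field except the structure type is an integer.
    return {k: (v if k == "type" else int(v)) for k, v in fields.items()}

print(parse_filename("type-ring_size-12_steps-8_seed-12_n-100"))
# {'type': 'ring', 'size': 12, 'steps': 8, 'seed': 12, 'n': 100}
```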
## Usage
Researchers and developers working in natural language processing (NLP), machine learning, and artificial intelligence (AI) can utilize the SpatialEvalLLM dataset for:
- Training and evaluating large language models (LLMs) on spatial reasoning and navigation tasks.
- Benchmarking the performance of different NLP models in understanding and following spatial instructions.
- Investigating the capabilities and limitations of LLMs in navigating diverse spatial configurations.
If you wish to reproduce the dataset or generate more prompts with different sizes and navigation steps, the code used for data generation, along with instructions, is available at https://github.com/runopti/SpatialEvalLLM. You can use it to generate custom prompts according to your specific requirements.
## Citation
If you use the SpatialEvalLLM dataset in your work, please cite the following paper:
```
@article{yamada2023evaluating,
title={Evaluating Spatial Understanding of Large Language Models},
author={Yamada, Yutaro and Bao, Yihan and Lampinen, Andrew K and Kasai, Jungo and Yildirim, Ilker},
journal={Transactions on Machine Learning Research},
year={2024}
}
```
## Contact
For any inquiries or issues regarding the dataset, please contact yutaro.yamada@yale.edu or yihan.bao@yale.edu. |
SiguienteGlobal/herramientas | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: tool_calls
list:
- name: function
struct:
- name: arguments
dtype: string
- name: name
dtype: string
- name: id
dtype: string
- name: tools
list:
- name: function
struct:
- name: description
dtype: string
- name: name
dtype: string
- name: parameters
struct:
- name: properties
struct:
- name: cursor
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: data
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: domain
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: id
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: ip
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: limit
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: relationship
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: x-apikey
struct:
- name: description
dtype: string
- name: type
dtype: string
- name: required
sequence: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 536725
num_examples: 112
download_size: 35755
dataset_size: 536725
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
taskydata/tasky_or_not | ---
license: mit
task_categories:
- text-classification
language:
- en
size_categories:
- 10M<n<100M
--- |
zolak/twitter_dataset_50_1713226894 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 119548
num_examples: 318
download_size: 67185
dataset_size: 119548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DanielDimas/t | ---
license: openrail
---
|
merve/turkish_instructions | ---
license: apache-2.0
---
|
juliaturc/rick-and-morty-manual-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11036008.0
num_examples: 151
- name: valid
num_bytes: 925318.0
num_examples: 16
download_size: 11931563
dataset_size: 11961326.0
---
# Dataset Card for "rick-and-morty-manual-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Quds/Qsh-da-msa | ---
license: openrail
---
# Dataset Card for [Qsh-da-msa]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[translation]
### Languages
[Arabic to Arabic]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[dialect]
[MSA]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-Quds>) for adding this dataset. |
Jha-Pranav/Tokenized-TinyStories | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-cnn_dailymail-da2ad07e-10655435 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: patrickvonplaten/bert2bert_cnn_daily_mail
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: train
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: patrickvonplaten/bert2bert_cnn_daily_mail
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mumumumu](https://huggingface.co/mumumumu) for evaluating this model. |
AppleHarem/ranger_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ranger (Azur Lane)
This is the dataset of ranger (Azur Lane), containing 45 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and related tools is also available: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 45 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 120 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 132 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 45 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 45 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 45 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 120 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 120 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 109 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 132 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 132 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
yotam56/hugo_dresses_ds | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Subfolder_1
'1': Subfolder_10
'2': Subfolder_11
'3': Subfolder_12
'4': Subfolder_13
'5': Subfolder_14
'6': Subfolder_15
'7': Subfolder_16
'8': Subfolder_17
'9': Subfolder_18
'10': Subfolder_19
'11': Subfolder_2
'12': Subfolder_20
'13': Subfolder_21
'14': Subfolder_22
'15': Subfolder_23
'16': Subfolder_24
'17': Subfolder_3
'18': Subfolder_4
'19': Subfolder_5
'20': Subfolder_6
'21': Subfolder_7
'22': Subfolder_8
'23': Subfolder_9
splits:
- name: train
num_bytes: 1193381.0
num_examples: 120
download_size: 0
dataset_size: 1193381.0
---
# Dataset Card for "hugo_dresses_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kki11/ACOD-12K | ---
license: cc-by-nc-sa-4.0
---
|
result-kand2-sdxl-wuerst-karlo/96ca277a | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 173
num_examples: 10
download_size: 1332
dataset_size: 173
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "96ca277a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
biglab/webui-350k | ---
license: other
---
This data accompanies the WebUI project (https://dl.acm.org/doi/abs/10.1145/3544548.3581158)
For more information, check out the project website: https://uimodeling.github.io/
To download this dataset, you need to install the huggingface-hub package
```
pip install huggingface-hub
```
Use snapshot_download
```
from huggingface_hub import snapshot_download
snapshot_download(repo_id="biglab/webui-350k", repo_type="dataset")
```
IMPORTANT
* Before downloading and using, please review the copyright info here: https://github.com/js0nwu/webui/blob/main/COPYRIGHT.txt
* Not all data samples have the same number of files (e.g., the same number of device screenshots) because the crawler used a timeout during collection
* The dataset released on HuggingFace was filtered using a list of explicit words and therefore contains fewer samples than the experiments originally used in the paper. The raw dataset is currently available (https://drive.google.com/drive/folders/1hcO75W2FjsZoibsj2TIbKz67hy9JkOBz?usp=share_link) but may be removed in the future. |
josiauhlol/autotrain-data-r2or-l87f-q4gb | ---
dataset_info:
features:
- name: output
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 757
num_examples: 14
- name: validation
num_bytes: 757
num_examples: 14
download_size: 4704
dataset_size: 1514
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-r2or-l87f-q4gb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
timm/imagenet-w21-webp-wds | ---
license: other
license_name: imagenet
license_link: https://www.image-net.org/download.php
task_categories:
- image-classification
pretty_name: ImageNet-Winter21
size_categories:
- 10M<n<100M
extra_gated_prompt: >-
By clicking on “Access repository” below, you also agree to ImageNet Terms of
Access:
[RESEARCHER_FULLNAME] (the "Researcher") has requested permission to use the
ImageNet database (the "Database") at Princeton University and Stanford
University. In exchange for such permission, Researcher hereby agrees to the
following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and
educational purposes.
2. Princeton University, Stanford University and Hugging Face make no
representations or warranties regarding the Database, including but not
limited to warranties of non-infringement or fitness for a particular purpose.
3. Researcher accepts full responsibility for his or her use of the Database
and shall defend and indemnify the ImageNet team, Princeton University,
Stanford University and Hugging Face, including their employees, Trustees,
officers and agents, against any and all claims arising from Researcher's use
of the Database, including but not limited to Researcher's use of any copies
of copyrighted images that he or she may create from the Database.
4. Researcher may provide research associates and colleagues with access to
the Database provided that they first agree to be bound by these terms and
conditions.
5. Princeton University, Stanford University and Hugging Face reserve the
right to terminate Researcher's access to the Database at any time.
6. If Researcher is employed by a for-profit, commercial entity, Researcher's
employer shall also be bound by these terms and conditions, and Researcher
hereby represents that he or she is fully authorized to enter into this
agreement on behalf of such employer.
7. The law of the State of New Jersey shall apply to all disputes under this
agreement.
tags:
- webdataset
---
## Dataset Description
- **Homepage:** https://image-net.org/index.php
- **Paper:** https://arxiv.org/abs/1409.0575
### Dataset Summary
This is a copy of the full `Winter21` release of ImageNet in webdataset tar format with WEBP encoded images. This release consists of 19167 classes, 2674 fewer classes than the original 21841 class `Fall11` release of the full ImageNet.
The classes were removed due to these concerns: https://www.image-net.org/update-sep-17-2019.php
This has the same contents as https://huggingface.co/datasets/timm/imagenet-w21-wds but encoded in WEBP at ~56% of the size, with the shard count halved.
### Data Splits
The full ImageNet dataset has no defined splits. This release follows that and leaves everything in the train split. Shards are shuffled so validation & test splits can be made by dividing at the shard level.
#### Train
* `imagenet12k-train-{0000..1023}.tar`
* 13151276 samples over 1024 shards
* 645.65 GB
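Since the shards are pre-shuffled, a validation or test split can be carved out simply by holding out whole shards. A minimal sketch (the 32-shard holdout size is an arbitrary example, not part of this release):

```python
# 1024 pre-shuffled training shards, named as listed above.
shards = [f"imagenet12k-train-{i:04d}.tar" for i in range(1024)]

# Hold out the last 32 shards as a validation set (arbitrary choice).
holdout = 32
train_shards, val_shards = shards[:-holdout], shards[-holdout:]

assert len(train_shards) + len(val_shards) == 1024
print(train_shards[0], val_shards[0])
```

Each list can then be passed to a webdataset-style loader as its set of shard URLs.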
### Processing
I performed some processing while sharding this dataset:
* All exif tags not related to color space were removed
* A set of 20 partially corrupted images in the original tar file were corrected and re-encoded
* All images with width or height < 32 (~2000 images) were removed.
* All images with the smallest edge > 768 were resized, maintaining aspect ratio, so that the smallest edge equals 768. This improves size & decoding-time uniformity for typical pretraining use cases.
* Images were re-encoded in WEBP
* Images were pre-shuffled across the shards
## Additional Information
### Dataset Curators
Authors of [[1]](https://arxiv.org/abs/1409.0575) and [[2]](https://ieeexplore.ieee.org/abstract/document/5206848):
- Olga Russakovsky
- Jia Deng
- Hao Su
- Jonathan Krause
- Sanjeev Satheesh
- Wei Dong
- Richard Socher
- Li-Jia Li
- Kai Li
- Sean Ma
- Zhiheng Huang
- Andrej Karpathy
- Aditya Khosla
- Michael Bernstein
- Alexander C Berg
- Li Fei-Fei
### Licensing Information
In exchange for permission to use the ImageNet database (the "Database") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
1. The law of the State of New Jersey shall apply to all disputes under this agreement.
### Citation Information
```bibtex
@article{imagenet15russakovsky,
Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei},
Title = { {ImageNet Large Scale Visual Recognition Challenge} },
Year = {2015},
journal = {International Journal of Computer Vision (IJCV)},
doi = {10.1007/s11263-015-0816-y},
volume={115},
number={3},
pages={211-252}
}
``` |
open-llm-leaderboard/details_freecs__Llama-3-7b | ---
pretty_name: Evaluation run of freecs/Llama-3-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [freecs/Llama-3-7b](https://huggingface.co/freecs/Llama-3-7b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_freecs__Llama-3-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T00:27:57.884190](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Llama-3-7b/blob/main/results_2024-01-17T00-27-57.884190.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.250629407659835,\n\
\ \"acc_stderr\": 0.030466481126384053,\n \"acc_norm\": 0.2521914286046078,\n\
\ \"acc_norm_stderr\": 0.031247424038738997,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871119,\n \"mc2\": 0.3803046918315385,\n\
\ \"mc2_stderr\": 0.014776905887343683\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29266211604095566,\n \"acc_stderr\": 0.013295916103619418,\n\
\ \"acc_norm\": 0.3464163822525597,\n \"acc_norm_stderr\": 0.013905011180063251\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42630950009958174,\n\
\ \"acc_stderr\": 0.004935291975579184,\n \"acc_norm\": 0.563931487751444,\n\
\ \"acc_norm_stderr\": 0.004948824501355487\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.0339549002085611,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.0339549002085611\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180362,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180362\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243998,\n\
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243998\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767485,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767485\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083293,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083293\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871119,\n \"mc2\": 0.3803046918315385,\n\
\ \"mc2_stderr\": 0.014776905887343683\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5966850828729282,\n \"acc_stderr\": 0.013787257285896248\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501839\n }\n}\n```"
repo_url: https://huggingface.co/freecs/Llama-3-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|arc:challenge|25_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|gsm8k|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hellaswag|10_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-27-57.884190.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T00-27-57.884190.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- '**/details_harness|winogrande|5_2024-01-17T00-27-57.884190.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T00-27-57.884190.parquet'
- config_name: results
data_files:
- split: 2024_01_17T00_27_57.884190
path:
- results_2024-01-17T00-27-57.884190.parquet
- split: latest
path:
- results_2024-01-17T00-27-57.884190.parquet
---
# Dataset Card for Evaluation run of freecs/Llama-3-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [freecs/Llama-3-7b](https://huggingface.co/freecs/Llama-3-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
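Because each run lives under a split named after its timestamp, you can also resolve the newest run programmatically instead of relying on the "latest" alias. A minimal sketch (the `latest_split` helper is illustrative, not part of the leaderboard tooling):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Assumes split names shaped like '2024_01_17T00_27_57.884190'
    (the 'latest' alias, if present, is skipped).
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

# With a single run, this simply returns its timestamp:
print(latest_split(["2024_01_17T00_27_57.884190", "latest"]))
# → 2024_01_17T00_27_57.884190
```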
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_freecs__Llama-3-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-17T00:27:57.884190](https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Llama-3-7b/blob/main/results_2024-01-17T00-27-57.884190.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.250629407659835,
"acc_stderr": 0.030466481126384053,
"acc_norm": 0.2521914286046078,
"acc_norm_stderr": 0.031247424038738997,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871119,
"mc2": 0.3803046918315385,
"mc2_stderr": 0.014776905887343683
},
"harness|arc:challenge|25": {
"acc": 0.29266211604095566,
"acc_stderr": 0.013295916103619418,
"acc_norm": 0.3464163822525597,
"acc_norm_stderr": 0.013905011180063251
},
"harness|hellaswag|10": {
"acc": 0.42630950009958174,
"acc_stderr": 0.004935291975579184,
"acc_norm": 0.563931487751444,
"acc_norm_stderr": 0.004948824501355487
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.0339549002085611,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.0339549002085611
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180362,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180362
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243998,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243998
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767485,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767485
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083293,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083293
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871119,
"mc2": 0.3803046918315385,
"mc2_stderr": 0.014776905887343683
},
"harness|winogrande|5": {
"acc": 0.5966850828729282,
"acc_stderr": 0.013787257285896248
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501839
}
}
```
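As a hedged illustration of working with these results, the per-subject entries in the JSON above can be macro-averaged in plain Python. The dict below is a hand-copied excerpt of those results, and `mean_acc` is an ad-hoc helper, not part of any leaderboard API:

```python
# Excerpt of the results JSON above; in practice, load the full file
# with json.load() and pass it in unchanged.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2518518518518518},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.18421052631578946},
}

def mean_acc(results, prefix="harness|hendrycksTest-"):
    """Unweighted mean of 'acc' over tasks whose key starts with `prefix`."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    return sum(accs) / len(accs)

print(round(mean_acc(results), 4))
```

Changing `prefix` selects a different task family; with an empty prefix the helper averages every entry that reports an `acc` field.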
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/cynthia_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cynthia (Fire Emblem)
This is the dataset of cynthia (Fire Emblem), containing 60 images and their tags.
The core tags of this character are `twintails, brown_hair, brown_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 64.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cynthia_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 36.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cynthia_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 130 | 74.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cynthia_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 57.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cynthia_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 130 | 105.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cynthia_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cynthia_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
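The IMG+TXT packages can also be consumed without waifuc. The sketch below assumes each image in the extracted folder sits next to a same-stem `.txt` file holding its tags (a convention inferred from the package type, not a documented API):

```python
from pathlib import Path

def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tag_string) for every image with a sibling .txt file."""
    for img in sorted(Path(dataset_dir).iterdir()):
        if img.suffix.lower() in {".jpg", ".png", ".webp"}:
            txt = img.with_suffix(".txt")
            if txt.exists():
                yield img, txt.read_text(encoding="utf-8").strip()

# Demo against the directory extracted by the snippet above, if it exists.
if Path("dataset_dir").is_dir():
    for img, tags in iter_img_txt_pairs("dataset_dir"):
        print(img.name, "->", tags)
```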
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, smile, solo, open_mouth, gloves, looking_at_viewer, breastplate, blush, thighhighs, shoulder_armor, simple_background, garter_straps, weapon |
| 1 | 12 |  |  |  |  |  | 1girl, hetero, penis, blush, nipples, solo_focus, 1boy, thighhighs, open_mouth, sex_from_behind, vaginal, elbow_gloves, large_breasts, navel, nude, boots, cum_in_pussy, medium_breasts, spread_legs, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | open_mouth | gloves | looking_at_viewer | breastplate | blush | thighhighs | shoulder_armor | simple_background | garter_straps | weapon | hetero | penis | nipples | solo_focus | 1boy | sex_from_behind | vaginal | elbow_gloves | large_breasts | navel | nude | boots | cum_in_pussy | medium_breasts | spread_legs | uncensored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:---------|:--------------------|:--------------|:--------|:-------------|:-----------------|:--------------------|:----------------|:---------|:---------|:--------|:----------|:-------------|:-------|:------------------|:----------|:---------------|:----------------|:--------|:-------|:--------|:---------------|:-----------------|:--------------|:-------------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | X | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/kinoshita_hinata_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kinoshita_hinata/木下ひなた (THE iDOLM@STER: Million Live!)
This is the dataset of kinoshita_hinata/木下ひなた (THE iDOLM@STER: Million Live!), containing 103 images and their tags.
The core tags of this character are `brown_hair, short_hair, ahoge, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 103 | 85.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinoshita_hinata_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 103 | 64.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinoshita_hinata_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 209 | 119.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinoshita_hinata_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 103 | 81.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinoshita_hinata_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 209 | 147.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kinoshita_hinata_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kinoshita_hinata_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, blush, simple_background, white_background, :d, long_sleeves, skirt, white_shirt |
| 1 | 5 |  |  |  |  |  | 1girl, open_mouth, :d, blush, bow, puffy_short_sleeves, blue_eyes, flower, gloves, hairband, looking_at_viewer, solo, frills, mini_hat, red_dress, skirt |
| 2 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, solo, :d, bracelet, character_name, dress, blush, hair_bow, heart, necklace |
| 3 | 13 |  |  |  |  |  | 1girl, solo, blush, open_mouth, :d, looking_at_viewer, apple, ^_^, dress, holding_food |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | open_mouth | blush | simple_background | white_background | :d | long_sleeves | skirt | white_shirt | bow | puffy_short_sleeves | blue_eyes | flower | gloves | hairband | frills | mini_hat | red_dress | bracelet | character_name | dress | hair_bow | heart | necklace | apple | ^_^ | holding_food |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:--------|:--------------------|:-------------------|:-----|:---------------|:--------|:--------------|:------|:----------------------|:------------|:---------|:---------|:-----------|:---------|:-----------|:------------|:-----------|:-----------------|:--------|:-----------|:--------|:-----------|:--------|:------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | | | |
| 3 | 13 |  |  |  |  |  | X | X | X | X | X | | | X | | | | | | | | | | | | | | | X | | | | X | X | X |
|
twdent/HikingHD | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 515009990.0
num_examples: 38
download_size: 159208907
dataset_size: 515009990.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "HikingHD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_government_and_politics-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 6650
num_examples: 5
- name: test
num_bytes: 592819
num_examples: 193
download_size: 13885
dataset_size: 599469
---
# Dataset Card for "mmlu-high_school_government_and_politics-neg-prepend-fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hypervariance/function-calling-sharegpt | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- function-calling
- function-call
- functions
size_categories:
- 10K<n<100K
---
This is a dataset for finetuning models on function calling based on [glaiveai/glaive-function-calling-v2](https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2).
The dataset includes 86,864 examples of chats that include function calling as part of the conversation. The system prompt includes either 0, 1, or 2 functions that the assistant can use, along with instructions on how the assistant can use them.
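As a rough sketch of consuming the format, the snippet below walks one ShareGPT-style record and pairs each `function_response` turn with the function call that preceded it. The field names (`conversations`, `from`, `value`) and the toy record are assumptions based on the common ShareGPT convention; check them against the actual dataset:

```python
import json

# Hypothetical record in ShareGPT format with a function-calling exchange.
example = {
    "conversations": [
        {"from": "human", "value": "What's the weather in Paris?"},
        {"from": "gpt",
         "value": '{"name": "get_weather", "arguments": {"city": "Paris"}}'},
        {"from": "function_response", "value": '{"temp_c": 18}'},
        {"from": "gpt", "value": "It is 18 degrees Celsius in Paris."},
    ]
}

def function_calls(record):
    """Return parsed JSON payloads of turns that precede a function_response."""
    turns = record["conversations"]
    return [json.loads(turns[i - 1]["value"])
            for i, turn in enumerate(turns)
            if turn["from"] == "function_response" and i > 0]

print(function_calls(example))
```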
Changes include:
* Using ShareGPT format for chats
* Adding "function_response" as a role
* Removing code examples
* Removing examples with invalid JSON as function calls / responses
* Updating system message to include instructions on how to do function calls |
camendes/bible-nt-dataset | ---
license: mit
---
|
Nexdata/2769_People_CCTV_Re_ID_Data_in_Europe | ---
license: cc-by-nc-nd-4.0
---
## Description
2,769 People – CCTV Re-ID Data in Europe. The data includes males and females; the race distribution covers Caucasian, Black, and Asian subjects, and the age distribution ranges from children to the elderly. The data diversity includes different age groups, different time periods, different cameras, and different human body orientations and postures. For annotation, rectangular bounding boxes and 15 human body attributes were annotated. This data can be used for re-ID and other tasks.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1322?source=Huggingface
## Data size
2,769 people, 1-25 cameras for each person
## Population distribution
race distribution: 2,646 Caucasians, 47 Asians, 76 blacks; gender distribution: 1,091 males, 1,678 females; age distribution: mainly young and middle-aged
## Collecting environment
department store
## Data diversity
different age groups, different time periods, different cameras, different human body orientations and postures
## Device
surveillance cameras, with resolutions of 960*576 and 1,440*1,616
## Collecting angle
looking down angle
## Collecting time
10:00-20:00
## Data format
the image data format is .jpg or .png; the annotation file format is .json
## Annotation content
human body rectangular bounding boxes, 15 human body attributes
## Accuracy rate
a rectangular bounding box of human body is qualified when the deviation is not more than 3 pixels, and the qualified rate of the bounding boxes shall not be lower than 97%; annotation accuracy of human attributes is over 97%; the accuracy of label annotation is not less than 97%
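The 3-pixel qualification rule can be checked mechanically. The sketch below assumes boxes are `[x1, y1, x2, y2]` pixel coordinates; the card does not specify the annotation JSON schema, so treat this as illustrative only:

```python
def box_qualified(annotated, reference, max_dev=3):
    """True if every coordinate deviates by at most `max_dev` pixels."""
    return all(abs(a - r) <= max_dev for a, r in zip(annotated, reference))

def qualified_rate(annotated_boxes, reference_boxes, max_dev=3):
    """Fraction of annotated boxes within tolerance of their reference boxes."""
    ok = sum(box_qualified(a, r, max_dev)
             for a, r in zip(annotated_boxes, reference_boxes))
    return ok / len(annotated_boxes)

# Two boxes within the 3-pixel tolerance, one box 5 pixels off.
rate = qualified_rate(
    [[10, 10, 50, 90], [100, 20, 140, 95], [10, 10, 50, 90]],
    [[12, 9, 52, 88], [100, 20, 140, 95], [15, 10, 50, 90]],
)
print(rate)
```

Under the card's acceptance criterion, a batch passes when `qualified_rate(...) >= 0.97`.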
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_Weyaxi__TekniumAiroboros-Nebula-7B | ---
pretty_name: Evaluation run of Weyaxi/TekniumAiroboros-Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/TekniumAiroboros-Nebula-7B](https://huggingface.co/Weyaxi/TekniumAiroboros-Nebula-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__TekniumAiroboros-Nebula-7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T17:19:18.874101](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__TekniumAiroboros-Nebula-7B_public/blob/main/results_2023-11-08T17-19-18.874101.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5486392579762343,\n\
\ \"acc_stderr\": 0.03374349666110138,\n \"acc_norm\": 0.5573176954338681,\n\
\ \"acc_norm_stderr\": 0.03452133425560909,\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5163563481502564,\n\
\ \"mc2_stderr\": 0.015552871980185646,\n \"em\": 0.34721057046979864,\n\
\ \"em_stderr\": 0.004875539476141144,\n \"f1\": 0.4132676174496653,\n\
\ \"f1_stderr\": 0.004730745670924258\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5264505119453925,\n \"acc_stderr\": 0.01459093135812017,\n\
\ \"acc_norm\": 0.5716723549488054,\n \"acc_norm_stderr\": 0.014460496367599019\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6211909978092014,\n\
\ \"acc_stderr\": 0.004840990593494692,\n \"acc_norm\": 0.817167894841665,\n\
\ \"acc_norm_stderr\": 0.0038573886135331004\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.030052580579557845,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.030052580579557845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562427,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6548387096774193,\n \"acc_stderr\": 0.02704574657353433,\n \"\
acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.02704574657353433\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094527,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094527\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633146,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633146\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.02957160106575337,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.02957160106575337\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6367713004484304,\n\
\ \"acc_stderr\": 0.032277904428505,\n \"acc_norm\": 0.6367713004484304,\n\
\ \"acc_norm_stderr\": 0.032277904428505\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709390974,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709390974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\
\ \"acc_stderr\": 0.015357212665829465,\n \"acc_norm\": 0.756066411238825,\n\
\ \"acc_norm_stderr\": 0.015357212665829465\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.026511261369409247,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.026511261369409247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n\
\ \"acc_stderr\": 0.01577491142238163,\n \"acc_norm\": 0.3340782122905028,\n\
\ \"acc_norm_stderr\": 0.01577491142238163\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631435,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631435\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100797,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596154,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596154\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.01265903323706725,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.01265903323706725\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n\
\ \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.5163563481502564,\n\
\ \"mc2_stderr\": 0.015552871980185646\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.34721057046979864,\n \
\ \"em_stderr\": 0.004875539476141144,\n \"f1\": 0.4132676174496653,\n\
\ \"f1_stderr\": 0.004730745670924258\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.09401061410159212,\n \"acc_stderr\": 0.008038819818872476\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/TekniumAiroboros-Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|arc:challenge|25_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|drop|3_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|gsm8k|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hellaswag|10_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T17-19-18.874101.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-08T17-19-18.874101.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- '**/details_harness|winogrande|5_2023-11-08T17-19-18.874101.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T17-19-18.874101.parquet'
- config_name: results
data_files:
- split: 2023_11_08T17_19_18.874101
path:
- results_2023-11-08T17-19-18.874101.parquet
- split: latest
path:
- results_2023-11-08T17-19-18.874101.parquet
---
# Dataset Card for Evaluation run of Weyaxi/TekniumAiroboros-Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/TekniumAiroboros-Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/TekniumAiroboros-Nebula-7B](https://huggingface.co/Weyaxi/TekniumAiroboros-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__TekniumAiroboros-Nebula-7B_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T17:19:18.874101](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__TekniumAiroboros-Nebula-7B_public/blob/main/results_2023-11-08T17-19-18.874101.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5486392579762343,
"acc_stderr": 0.03374349666110138,
"acc_norm": 0.5573176954338681,
"acc_norm_stderr": 0.03452133425560909,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5163563481502564,
"mc2_stderr": 0.015552871980185646,
"em": 0.34721057046979864,
"em_stderr": 0.004875539476141144,
"f1": 0.4132676174496653,
"f1_stderr": 0.004730745670924258
},
"harness|arc:challenge|25": {
"acc": 0.5264505119453925,
"acc_stderr": 0.01459093135812017,
"acc_norm": 0.5716723549488054,
"acc_norm_stderr": 0.014460496367599019
},
"harness|hellaswag|10": {
"acc": 0.6211909978092014,
"acc_stderr": 0.004840990593494692,
"acc_norm": 0.817167894841665,
"acc_norm_stderr": 0.0038573886135331004
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.030052580579557845,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.030052580579557845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562427,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.02704574657353433,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.02704574657353433
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094527,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094527
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633146,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633146
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.02957160106575337,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.02957160106575337
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6367713004484304,
"acc_stderr": 0.032277904428505,
"acc_norm": 0.6367713004484304,
"acc_norm_stderr": 0.032277904428505
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709390974,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709390974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.756066411238825,
"acc_stderr": 0.015357212665829465,
"acc_norm": 0.756066411238825,
"acc_norm_stderr": 0.015357212665829465
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.026511261369409247,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.026511261369409247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.01577491142238163,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.01577491142238163
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631435,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631435
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100797,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596154,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.01265903323706725,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.01265903323706725
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.5163563481502564,
"mc2_stderr": 0.015552871980185646
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893009
},
"harness|drop|3": {
"em": 0.34721057046979864,
"em_stderr": 0.004875539476141144,
"f1": 0.4132676174496653,
"f1_stderr": 0.004730745670924258
},
"harness|gsm8k|5": {
"acc": 0.09401061410159212,
"acc_stderr": 0.008038819818872476
}
}
```
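Once loaded, the results dict above can be post-processed directly. As a minimal sketch (using only a small excerpt of the dict shown above, not the full set of tasks), here is how you might collect the per-task MMLU (`hendrycksTest`) accuracies and average them:

```python
# Minimal sketch: average the "acc" metric across hendrycksTest tasks,
# given a results dict shaped like the JSON above (excerpt only).
results = {
    "harness|arc:challenge|25": {"acc": 0.5264505119453925},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5259259259259259},
    "harness|winogrande|5": {"acc": 0.7324388318863457},
}

# Keep only the MMLU subtasks, identified by their "hendrycksTest-" prefix.
mmlu = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
mmlu_mean = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc = {mmlu_mean:.4f}")
```

The same filtering pattern works on the full dict loaded from the "results" configuration; only the number of matching tasks changes.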
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jilp00__OpenHermes-Symbolic-Mistral-7B | ---
pretty_name: Evaluation run of jilp00/OpenHermes-Symbolic-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jilp00/OpenHermes-Symbolic-Mistral-7B](https://huggingface.co/jilp00/OpenHermes-Symbolic-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jilp00__OpenHermes-Symbolic-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T02:06:58.494676](https://huggingface.co/datasets/open-llm-leaderboard/details_jilp00__OpenHermes-Symbolic-Mistral-7B/blob/main/results_2024-01-08T02-06-58.494676.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6047117383672899,\n\
\ \"acc_stderr\": 0.03239674405428559,\n \"acc_norm\": 0.6186896145771656,\n\
\ \"acc_norm_stderr\": 0.03319458836849609,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757473,\n \"mc2\": 0.4535233817405157,\n\
\ \"mc2_stderr\": 0.016372852954289533\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056987,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955272\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47520414260107546,\n\
\ \"acc_stderr\": 0.004983641854351149,\n \"acc_norm\": 0.7254530969926309,\n\
\ \"acc_norm_stderr\": 0.004453735900947838\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469553,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469553\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922526,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922526\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21675977653631284,\n\
\ \"acc_stderr\": 0.013780598486443332,\n \"acc_norm\": 0.21675977653631284,\n\
\ \"acc_norm_stderr\": 0.013780598486443332\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388995,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388995\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045706,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045706\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553704,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553704\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757473,\n \"mc2\": 0.4535233817405157,\n\
\ \"mc2_stderr\": 0.016372852954289533\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6621941594317285,\n \"acc_stderr\": 0.013292583502910887\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \
\ \"acc_stderr\": 0.003828982978735705\n }\n}\n```"
repo_url: https://huggingface.co/jilp00/OpenHermes-Symbolic-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|arc:challenge|25_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|arc:challenge|25_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|gsm8k|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|gsm8k|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hellaswag|10_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hellaswag|10_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T22-59-14.313525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-06-58.494676.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T02-06-58.494676.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- '**/details_harness|winogrande|5_2024-01-06T22-59-14.313525.parquet'
- split: 2024_01_08T02_06_58.494676
path:
- '**/details_harness|winogrande|5_2024-01-08T02-06-58.494676.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T02-06-58.494676.parquet'
- config_name: results
data_files:
- split: 2024_01_06T22_59_14.313525
path:
- results_2024-01-06T22-59-14.313525.parquet
- split: 2024_01_08T02_06_58.494676
path:
- results_2024-01-08T02-06-58.494676.parquet
- split: latest
path:
- results_2024-01-08T02-06-58.494676.parquet
---
# Dataset Card for Evaluation run of jilp00/OpenHermes-Symbolic-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jilp00/OpenHermes-Symbolic-Mistral-7B](https://huggingface.co/jilp00/OpenHermes-Symbolic-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jilp00__OpenHermes-Symbolic-Mistral-7B",
"harness_winogrande_5",
split="latest")
```
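Each run split is named after the run's timestamp, with the `-` and `:` characters of the ISO format replaced by `_` (e.g. `2024_01_08T02_06_58.494676`). As a small illustration (this helper is hypothetical and not part of the `datasets` library), a split name can be converted back into a `datetime` object:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Parse a run split name like '2024_01_08T02_06_58.494676'
    back into a datetime (illustration only, not a library API)."""
    date_part, time_part = split_name.split("T")
    # Restore the ISO 8601 separators: '-' in the date, ':' in the time.
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2024_01_08T02_06_58.494676"))
```

This can be useful for sorting the timestamped splits of a configuration chronologically.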
## Latest results
These are the [latest results from run 2024-01-08T02:06:58.494676](https://huggingface.co/datasets/open-llm-leaderboard/details_jilp00__OpenHermes-Symbolic-Mistral-7B/blob/main/results_2024-01-08T02-06-58.494676.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6047117383672899,
"acc_stderr": 0.03239674405428559,
"acc_norm": 0.6186896145771656,
"acc_norm_stderr": 0.03319458836849609,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757473,
"mc2": 0.4535233817405157,
"mc2_stderr": 0.016372852954289533
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056987,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.014542104569955272
},
"harness|hellaswag|10": {
"acc": 0.47520414260107546,
"acc_stderr": 0.004983641854351149,
"acc_norm": 0.7254530969926309,
"acc_norm_stderr": 0.004453735900947838
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469553,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469553
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217902,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217902
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922526,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21675977653631284,
"acc_stderr": 0.013780598486443332,
"acc_norm": 0.21675977653631284,
"acc_norm_stderr": 0.013780598486443332
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388995,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388995
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045706,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045706
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553704,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757473,
"mc2": 0.4535233817405157,
"mc2_stderr": 0.016372852954289533
},
"harness|winogrande|5": {
"acc": 0.6621941594317285,
"acc_stderr": 0.013292583502910887
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.003828982978735705
}
}
```
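The per-task numbers above feed into the aggregated "all" scores reported for the run. As a minimal sketch (with hypothetical task names and rounded values, not this run's actual figures), an unweighted macro-average over tasks can be computed like this, assuming each task counts equally:

```python
# Hypothetical per-task accuracies (illustrative values only).
task_acc = {
    "hendrycksTest-philosophy": 0.6785,
    "hendrycksTest-prehistory": 0.7037,
    "hendrycksTest-professional_law": 0.4654,
}

def macro_average(scores: dict) -> float:
    """Unweighted mean of per-task scores: each task counts equally,
    regardless of how many examples it contains."""
    return sum(scores.values()) / len(scores)

print(round(macro_average(task_acc), 4))  # → 0.6159
```

Note that an unweighted mean is only one aggregation choice; a micro-average weighted by example counts would give different results when task sizes vary.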
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
serkandyck/turkish_instructions | ---
language:
- tr
---
Translated from https://huggingface.co/datasets/merve/turkish_instructions for Llama 2 fine-tuning. |
houck2040/agri | ---
license: mit
---
|
open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.2 | ---
pretty_name: Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Calme-7B-Instruct-v0.2](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T22:27:40.355273](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.2/blob/main/results_2024-03-24T22-27-40.355273.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6502193531305914,\n\
\ \"acc_stderr\": 0.03206041212180262,\n \"acc_norm\": 0.6492744667926741,\n\
\ \"acc_norm_stderr\": 0.032734535923634796,\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7800396830271358,\n\
\ \"mc2_stderr\": 0.013692121372824433\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428173,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7178848834893448,\n\
\ \"acc_stderr\": 0.00449109352811341,\n \"acc_norm\": 0.8918542123083051,\n\
\ \"acc_norm_stderr\": 0.0030992974183235455\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.012757683047716175,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.012757683047716175\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7800396830271358,\n\
\ \"mc2_stderr\": 0.013692121372824433\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \
\ \"acc_stderr\": 0.012616300735519658\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|arc:challenge|25_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|gsm8k|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hellaswag|10_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-27-40.355273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T22-27-40.355273.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- '**/details_harness|winogrande|5_2024-03-24T22-27-40.355273.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T22-27-40.355273.parquet'
- config_name: results
data_files:
- split: 2024_03_24T22_27_40.355273
path:
- results_2024-03-24T22-27-40.355273.parquet
- split: latest
path:
- results_2024-03-24T22-27-40.355273.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Calme-7B-Instruct-v0.2](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run appears as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.2",
	"harness_winogrande_5",
	split="latest")
```
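Since every per-task configuration name in the YAML header above follows the same `harness_<task>_<n_shot>` pattern, a small helper can build them instead of hard-coding each one. This is an illustrative sketch, not part of the `datasets` API: `task_config` and `load_task_details` are hypothetical helper names.

```python
def task_config(task: str, n_shot: int = 5) -> str:
    """Build a per-task config name, e.g. 'harness_hendrycksTest_anatomy_5'."""
    return f"harness_{task}_{n_shot}"

def load_task_details(repo: str, task: str, n_shot: int = 5):
    """Load the most recent details split for one task (requires `pip install datasets`)."""
    from datasets import load_dataset
    return load_dataset(repo, task_config(task, n_shot), split="latest")

# Example (downloads data from the Hub):
# details = load_task_details(
#     "open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.2",
#     "hendrycksTest_anatomy",
# )
```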
## Latest results
These are the [latest results from run 2024-03-24T22:27:40.355273](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.2/blob/main/results_2024-03-24T22-27-40.355273.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of the corresponding per-task configuration):
```python
{
"all": {
"acc": 0.6502193531305914,
"acc_stderr": 0.03206041212180262,
"acc_norm": 0.6492744667926741,
"acc_norm_stderr": 0.032734535923634796,
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7800396830271358,
"mc2_stderr": 0.013692121372824433
},
"harness|arc:challenge|25": {
"acc": 0.71160409556314,
"acc_stderr": 0.013238394422428173,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710695
},
"harness|hellaswag|10": {
"acc": 0.7178848834893448,
"acc_stderr": 0.00449109352811341,
"acc_norm": 0.8918542123083051,
"acc_norm_stderr": 0.0030992974183235455
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944427,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716175,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716175
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7800396830271358,
"mc2_stderr": 0.013692121372824433
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519658
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BrandonZYW/IntentEmotion | ---
configs:
- config_name: intent
data_files:
- split: test
path: intent.csv
- config_name: emotion
data_files:
- split: test
path: emotion.csv
license: mit
--- |
yongchoooon/fire-aihub-new-chatgpt | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: fire-aihub-new-chatgpt
size_categories:
- n<1K
tags: []
task_categories:
- text-to-image
task_ids: []
--- |
tyzhu/wiki_find_passage_train50_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 95616
num_examples: 140
- name: validation
num_bytes: 33448
num_examples: 40
download_size: 74460
dataset_size: 129064
---
# Dataset Card for "wiki_find_passage_train50_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713200378 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21242
num_examples: 57
download_size: 19679
dataset_size: 21242
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713200378"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2_sample_validation_benchmarks_partition_global_3_loca_3 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 41
num_examples: 1
download_size: 0
dataset_size: 41
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_global_3_loca_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bdsaglam/webnlg-jerx-sft-mt-ss-openai | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 13562315
num_examples: 17636
- name: dev
num_bytes: 1718829
num_examples: 2249
- name: test
num_bytes: 3051253
num_examples: 3668
download_size: 5365811
dataset_size: 18332397
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
vikp/hydra_inst_labeled_bad | ---
dataset_info:
features:
- name: unique_conversation_id
dtype: string
- name: rendered
dtype: string
- name: dataset_id
dtype: string
- name: inst_prob
dtype: float64
splits:
- name: train
num_bytes: 90343785.37738979
num_examples: 47604
download_size: 32011958
dataset_size: 90343785.37738979
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hydra_inst_labeled_bad"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-samsum-22cb3f56-11055482 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: pszemraj/led-base-book-summary
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-base-book-summary
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
occiglot/tokenizer-wiki-bench | ---
license: mit
pretty_name: Multilingual Tokenizer Wikipedia Benchmark
dataset_info:
- config_name: af
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 541481060
num_examples: 112518
- name: clean
num_bytes: 539551289.6071739
num_examples: 112117
download_size: 441191361
dataset_size: 1081032349.607174
- config_name: ar
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 7007645793
num_examples: 1219201
- name: clean
num_bytes: 6980694657.688122
num_examples: 1214512
download_size: 4415559180
dataset_size: 13988340450.688122
- config_name: bg
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2437923560
num_examples: 294275
- name: clean
num_bytes: 2433855866.6248918
num_examples: 293784
download_size: 1805069655
dataset_size: 4871779426.624891
- config_name: ca
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 4801022979
num_examples: 737409
- name: clean
num_bytes: 4766991732.959834
num_examples: 732182
download_size: 3884482903
dataset_size: 9568014711.959835
- config_name: cs
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 3740905267
num_examples: 534044
- name: clean
num_bytes: 3730243864.91258
num_examples: 532522
download_size: 3671037924
dataset_size: 7471149131.9125805
- config_name: da
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1298277678
num_examples: 295347
- name: clean
num_bytes: 1292602738.074089
num_examples: 294056
download_size: 1782396281
dataset_size: 2590880416.074089
- config_name: de
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 23086869184
num_examples: 2845308
- name: clean
num_bytes: 23073148386.18474
num_examples: 2843617
download_size: 21942020975
dataset_size: 46160017570.18474
- config_name: el
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 3002968703
num_examples: 226834
- name: clean
num_bytes: 2973684879.714972
num_examples: 224622
download_size: 2295250961
dataset_size: 5976653582.714972
- config_name: en
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 49746869820
num_examples: 6407814
- name: clean
num_bytes: 49560903666.851944
num_examples: 6383860
download_size: 40592018321
dataset_size: 99307773486.85194
- config_name: es
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 14759846818
num_examples: 1841155
- name: clean
num_bytes: 14536992695.618353
num_examples: 1813356
download_size: 12175892555
dataset_size: 29296839513.618355
- config_name: et
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1019050491
num_examples: 240397
- name: clean
num_bytes: 1016723262.6254404
num_examples: 239848
download_size: 1019164563
dataset_size: 2035773753.6254404
- config_name: eu
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1291195010
num_examples: 416347
- name: clean
num_bytes: 1265327506.262949
num_examples: 408006
download_size: 968840915
dataset_size: 2556522516.262949
- config_name: fa
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 4224898253
num_examples: 979869
- name: clean
num_bytes: 4213433450.6083264
num_examples: 977210
download_size: 2499698548
dataset_size: 8438331703.608326
- config_name: fi
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2600737260
num_examples: 561598
- name: clean
num_bytes: 2595874753.1481237
num_examples: 560548
download_size: 2642007766
dataset_size: 5196612013.148124
- config_name: fr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 20069732840
num_examples: 2564646
- name: clean
num_bytes: 19942544382.860683
num_examples: 2548393
download_size: 16151551755
dataset_size: 40012277222.86069
- config_name: ga
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 142209710
num_examples: 59156
- name: clean
num_bytes: 141702470.68682805
num_examples: 58945
download_size: 121745838
dataset_size: 283912180.686828
- config_name: he
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 4776226234
num_examples: 333874
- name: clean
num_bytes: 4760232712.702708
num_examples: 332756
download_size: 3499530576
dataset_size: 9536458946.70271
- config_name: hi
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1452853579
num_examples: 163093
- name: clean
num_bytes: 1443152625.8779714
num_examples: 162004
download_size: 785363639
dataset_size: 2896006204.8779716
- config_name: hr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1064630680
num_examples: 202848
- name: clean
num_bytes: 1053026432.3195693
num_examples: 200637
download_size: 1028743775
dataset_size: 2117657112.3195693
- config_name: hu
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 3533169653
num_examples: 532427
- name: clean
num_bytes: 3510335279.8822336
num_examples: 528986
download_size: 3558613373
dataset_size: 7043504932.882234
- config_name: hy
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2568868378
num_examples: 303036
- name: clean
num_bytes: 2555898405.394963
num_examples: 301506
download_size: 1781142597
dataset_size: 5124766783.394962
- config_name: id
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2650288629
num_examples: 665622
- name: clean
num_bytes: 2630666948.280745
num_examples: 660694
download_size: 2040186206
dataset_size: 5280955577.280745
- config_name: it
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 12188918391
num_examples: 1833639
- name: clean
num_bytes: 12163279397.591763
num_examples: 1829782
download_size: 10368836428
dataset_size: 24352197788.591763
- config_name: ja
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 18752888787
num_examples: 1389467
- name: clean
num_bytes: 18684866617.717476
num_examples: 1384427
download_size: 15232900753
dataset_size: 37437755404.717476
- config_name: ko
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 3160932689
num_examples: 647897
- name: clean
num_bytes: 3151741108.878351
num_examples: 646013
download_size: 3074385022
dataset_size: 6312673797.878351
- config_name: lt
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 781319902
num_examples: 211292
- name: clean
num_bytes: 777474168.616436
num_examples: 210252
download_size: 722780874
dataset_size: 1558794070.616436
- config_name: lv
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 531956241
num_examples: 123413
- name: clean
num_bytes: 530943303.00615007
num_examples: 123178
download_size: 700342420
dataset_size: 1062899544.00615
- config_name: mr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 547060763
num_examples: 94133
- name: clean
num_bytes: 545450957.3914355
num_examples: 93856
download_size: 278141890
dataset_size: 1092511720.3914356
- config_name: nl
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 6191062892
num_examples: 2135977
- name: clean
num_bytes: 6177393712.697661
num_examples: 2131261
download_size: 5179824678
dataset_size: 12368456604.697662
- config_name: 'no'
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2437191515
num_examples: 617937
- name: clean
num_bytes: 2428893175.610127
num_examples: 615833
download_size: 2175299531
dataset_size: 4866084690.6101265
- config_name: pl
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 6864626419
num_examples: 1587721
- name: clean
num_bytes: 6861024883.335341
num_examples: 1586888
download_size: 6565864124
dataset_size: 13725651302.335342
- config_name: pt
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 6844185526
num_examples: 1112246
- name: clean
num_bytes: 6755821527.2502985
num_examples: 1097886
download_size: 5516209748
dataset_size: 13600007053.250298
- config_name: ro
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2023493174
num_examples: 442389
- name: clean
num_bytes: 2006866635.6197736
num_examples: 438754
download_size: 1652633599
dataset_size: 4030359809.619774
- config_name: ru
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 22550679128
num_examples: 1945063
- name: clean
num_bytes: 22439204702.844765
num_examples: 1935448
download_size: 18884603758
dataset_size: 44989883830.844765
- config_name: sa
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 146460109
num_examples: 12156
- name: clean
num_bytes: 145435996.68797302
num_examples: 12071
download_size: 95836795
dataset_size: 291896105.687973
- config_name: sk
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 977962245
num_examples: 242235
- name: clean
num_bytes: 976048590.4738994
num_examples: 241761
download_size: 1346611201
dataset_size: 1954010835.4738994
- config_name: sl
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1106532891
num_examples: 183006
- name: clean
num_bytes: 1097995332.4385757
num_examples: 181594
download_size: 1006028852
dataset_size: 2204528223.4385757
- config_name: sr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 3755288114
num_examples: 676605
- name: clean
num_bytes: 3735557179.0449376
num_examples: 673050
download_size: 2558022832
dataset_size: 7490845293.044937
- config_name: sv
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 4861956987
num_examples: 2574513
- name: clean
num_bytes: 4857071448.365948
num_examples: 2571926
download_size: 3512612936
dataset_size: 9719028435.365948
- config_name: ta
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1693909025
num_examples: 160651
- name: clean
num_bytes: 1682405487.85255
num_examples: 159560
download_size: 985318775
dataset_size: 3376314512.85255
- config_name: te
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 1556095028
num_examples: 87854
- name: clean
num_bytes: 1550320823.3066678
num_examples: 87528
download_size: 746686495
dataset_size: 3106415851.306668
- config_name: tr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 2317236022
num_examples: 534988
- name: clean
num_bytes: 2301578085.336879
num_examples: 531373
download_size: 2055444454
dataset_size: 4618814107.336879
- config_name: uk
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 10935662610
num_examples: 1294720
- name: clean
num_bytes: 10860532296.947023
num_examples: 1285825
download_size: 8344390939
dataset_size: 21796194906.94702
- config_name: ur
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 918249794
num_examples: 200154
- name: clean
num_bytes: 912616078.225986
num_examples: 198926
download_size: 534834968
dataset_size: 1830865872.225986
- config_name: vi
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: split_text
sequence: string
splits:
- name: train
num_bytes: 3685585608
num_examples: 1288680
download_size: 1346636217
dataset_size: 3685585608
configs:
- config_name: af
data_files:
- split: train
path: af/train-*
- split: clean
path: af/clean-*
- config_name: ar
data_files:
- split: train
path: ar/train-*
- split: clean
path: ar/clean-*
- config_name: bg
data_files:
- split: train
path: bg/train-*
- split: clean
path: bg/clean-*
- config_name: ca
data_files:
- split: train
path: ca/train-*
- split: clean
path: ca/clean-*
- config_name: cs
data_files:
- split: train
path: cs/train-*
- split: clean
path: cs/clean-*
- config_name: da
data_files:
- split: train
path: da/train-*
- split: clean
path: da/clean-*
- config_name: de
data_files:
- split: train
path: de/train-*
- split: clean
path: de/clean-*
- config_name: el
data_files:
- split: train
path: el/train-*
- split: clean
path: el/clean-*
- config_name: en
data_files:
- split: train
path: en/train-*
- split: clean
path: en/clean-*
- config_name: es
data_files:
- split: train
path: es/train-*
- split: clean
path: es/clean-*
- config_name: et
data_files:
- split: train
path: et/train-*
- split: clean
path: et/clean-*
- config_name: eu
data_files:
- split: train
path: eu/train-*
- split: clean
path: eu/clean-*
- config_name: fa
data_files:
- split: train
path: fa/train-*
- split: clean
path: fa/clean-*
- config_name: fi
data_files:
- split: train
path: fi/train-*
- split: clean
path: fi/clean-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- split: clean
path: fr/clean-*
- config_name: ga
data_files:
- split: train
path: ga/train-*
- split: clean
path: ga/clean-*
- config_name: he
data_files:
- split: train
path: he/train-*
- split: clean
path: he/clean-*
- config_name: hi
data_files:
- split: train
path: hi/train-*
- split: clean
path: hi/clean-*
- config_name: hr
data_files:
- split: train
path: hr/train-*
- split: clean
path: hr/clean-*
- config_name: hu
data_files:
- split: train
path: hu/train-*
- split: clean
path: hu/clean-*
- config_name: hy
data_files:
- split: train
path: hy/train-*
- split: clean
path: hy/clean-*
- config_name: id
data_files:
- split: train
path: id/train-*
- split: clean
path: id/clean-*
- config_name: it
data_files:
- split: train
path: it/train-*
- split: clean
path: it/clean-*
- config_name: ja
data_files:
- split: train
path: ja/train-*
- split: clean
path: ja/clean-*
- config_name: ko
data_files:
- split: train
path: ko/train-*
- split: clean
path: ko/clean-*
- config_name: lt
data_files:
- split: train
path: lt/train-*
- split: clean
path: lt/clean-*
- config_name: lv
data_files:
- split: train
path: lv/train-*
- split: clean
path: lv/clean-*
- config_name: mr
data_files:
- split: train
path: mr/train-*
- split: clean
path: mr/clean-*
- config_name: nl
data_files:
- split: train
path: nl/train-*
- split: clean
path: nl/clean-*
- config_name: 'no'
data_files:
- split: train
path: no/train-*
- split: clean
path: no/clean-*
- config_name: pl
data_files:
- split: train
path: pl/train-*
- split: clean
path: pl/clean-*
- config_name: pt
data_files:
- split: train
path: pt/train-*
- split: clean
path: pt/clean-*
- config_name: ro
data_files:
- split: train
path: ro/train-*
- split: clean
path: ro/clean-*
- config_name: ru
data_files:
- split: train
path: ru/train-*
- split: clean
path: ru/clean-*
- config_name: sa
data_files:
- split: train
path: sa/train-*
- split: clean
path: sa/clean-*
- config_name: sk
data_files:
- split: train
path: sk/train-*
- split: clean
path: sk/clean-*
- config_name: sl
data_files:
- split: train
path: sl/train-*
- split: clean
path: sl/clean-*
- config_name: sr
data_files:
- split: train
path: sr/train-*
- split: clean
path: sr/clean-*
- config_name: sv
data_files:
- split: train
path: sv/train-*
- split: clean
path: sv/clean-*
- config_name: ta
data_files:
- split: train
path: ta/train-*
- split: clean
path: ta/clean-*
- config_name: te
data_files:
- split: train
path: te/train-*
- split: clean
path: te/clean-*
- config_name: tr
data_files:
- split: train
path: tr/train-*
- split: clean
path: tr/clean-*
- config_name: uk
data_files:
- split: train
path: uk/train-*
- split: clean
path: uk/clean-*
- config_name: ur
data_files:
- split: train
path: ur/train-*
- split: clean
path: ur/clean-*
- config_name: vi
data_files:
- split: train
path: vi/train-*
language:
- af
- ar
- bg
- ca
- cs
- da
- de
- el
- en
- es
- et
- eu
- fa
- fi
- fr
- ga
- he
- hi
- hr
- hu
- hy
- id
- it
- ja
- ko
- lt
- lv
- mr
- nl
- 'no'
- pl
- pt
- ro
- ru
- sa
- sk
- sl
- sr
- sv
- ta
- te
- tr
- uk
- ur
- vi
---
# Multilingual Tokenizer Benchmark
This dataset includes pre-processed Wikipedia data for tokenizer evaluation in [45 languages](https://huggingface.co/datasets/occiglot/tokenizer-wiki-bench/blob/main/README.md#supported-languages). We provide more background on the evaluation task in [this blogpost](https://occiglot.github.io/occiglot/posts/eu_tokenizer_perfomance/).
## Usage
The dataset allows us to easily calculate *tokenizer fertility* and the *proportion of continued words* on any of the supported languages. In the example below we take the Mistral tokenizer and evaluate its performance on Slovak.
```python
from transformers import AutoTokenizer
from datasets import load_dataset
import numpy as np
def calculate_metrics(tokens):
    # Number of sub-tokens produced for each pre-split word
    tmp = np.array([len(y) for y in tokens])
    # fertility: mean tokens per word; cont_prop: share of words split into more than one token
    return {'fertility': np.mean(tmp), 'cont_prop': np.count_nonzero(tmp > 1) / tmp.shape[0]}
tokenizer_name = 'mistralai/Mistral-7B-v0.1'
language = 'sk' #Slovak
tokenizer = AutoTokenizer.from_pretrained(tokenizer_name)
ds = load_dataset('occiglot/tokenizer-wiki-bench', name=language, split='clean')
remove_columns = list(set(ds.column_names) - set(["text"]))
# Tokenize each pre-split word in isolation, without special tokens
ds = ds.map(lambda x: {'tokens': tokenizer(x['split_text'], add_special_tokens=False)['input_ids']}, num_proc=256, remove_columns=remove_columns, batched=False)
ds = ds.map(lambda x: calculate_metrics(x['tokens']), num_proc=256, batched=False)
df = ds.to_pandas()
print('Fertility: ', df.fertility.mean())
print('Prop. continued words:', df.cont_prop.mean())
```
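For intuition on the two metrics, here is a minimal hand-computed example (re-declaring `calculate_metrics` from the snippet above so it runs standalone; the token ids are made up):

```python
import numpy as np

def calculate_metrics(tokens):
    # tokens: one list of sub-token ids per pre-split word
    tmp = np.array([len(y) for y in tokens])
    return {'fertility': np.mean(tmp), 'cont_prop': np.count_nonzero(tmp > 1) / tmp.shape[0]}

# Three words, tokenized into 1, 2 and 3 sub-tokens respectively
toy = [[101], [72, 85], [3, 14, 15]]
metrics = calculate_metrics(toy)
print(metrics)  # fertility = (1+2+3)/3 = 2.0, cont_prop = 2/3
```

A lower fertility means the tokenizer encodes the language more compactly; a lower proportion of continued words means fewer words are broken into multiple pieces.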
## Dataset Creation
We loosely follow the approach of [Rust et al.](https://arxiv.org/abs/2012.15613), using the fast [UDPipe](https://ufal.mff.cuni.cz/udpipe) to pre-split documents into words and subsequently running the tokenizer over isolated words. For all languages we use the respective November 2023 snapshot from [Wikipedia](https://huggingface.co/datasets/wikimedia/wikipedia). Since Wikipedia, by nature, contains significantly more numbers and dates than other text, and most tokenizers split those into single digits, we filtered all lone-standing numbers from the documents. Additionally, we removed any documents that still contained non-parsed HTML code (less than 1%).
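A minimal sketch of the number-filtering step described above (the regex and function name are illustrative assumptions, not the exact pre-processing code used for this dataset):

```python
import re

# Illustrative assumption: a "lone-standing number" is a word made up only of
# digits, optionally with '.' or ',' separators (e.g. '1923', '3.14', '10,000').
LONE_NUMBER = re.compile(r"^[0-9]+([.,][0-9]+)*$")

def drop_lone_numbers(words):
    # Keep every pre-split word that is not a pure number
    return [w for w in words if not LONE_NUMBER.match(w)]

words = ["Founded", "in", "1923", ",", "it", "had", "10,000", "residents"]
print(drop_lone_numbers(words))  # ['Founded', 'in', ',', 'it', 'had', 'residents']
```

Filtering these tokens avoids skewing the fertility statistics, since most tokenizers split long digit sequences into one token per digit regardless of language.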
## Licensing
We release our curated benchmark and any associated code under [MIT](https://opensource.org/license/mit) license. However, depending on your use case, the licensing conditions of the original [Wikipedia data](https://huggingface.co/datasets/wikimedia/wikipedia#licensing-information) and [UDPipe](https://github.com/ufal/udpipe/tree/udpipe-2?tab=License-1-ov-file) may apply.
## Supported Languages
This dataset currently contains pre-processed data for the following languages:
| Language | Code |
|:-----------|:-------|
| Afrikaans | af |
| Arabic | ar |
| Armenian | hy |
| Basque | eu |
| Bulgarian | bg |
| Catalan | ca |
| Croatian | hr |
| Czech | cs |
| Danish | da |
| Dutch | nl |
| English | en |
| Estonian | et |
| Finnish | fi |
| French | fr |
| German | de |
| Greek | el |
| Hebrew | he |
| Hindi | hi |
| Hungarian | hu |
| Indonesian | id |
| Irish | ga |
| Italian | it |
| Japanese | ja |
| Korean | ko |
| Latvian | lv |
| Lithuanian | lt |
| Marathi | mr |
| Norwegian | no |
| Persian | fa |
| Polish | pl |
| Portuguese | pt |
| Romanian | ro |
| Russian | ru |
| Sanskrit | sa |
| Serbian | sr |
| Slovak | sk |
| Slovenian | sl |
| Spanish | es |
| Swedish | sv |
| Tamil | ta |
| Telugu | te |
| Turkish | tr |
| Ukrainian | uk |
| Urdu | ur |
| Vietnamese | vi | |
josephloh/donut-receipts123 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 16831476.0
num_examples: 100
- name: validation
num_bytes: 2401425.0
num_examples: 11
- name: test
num_bytes: 2915735.0
num_examples: 12
download_size: 22077520
dataset_size: 22148636.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
HuggingFaceM4/idefics2-bias-eval | Invalid username or password. |
bismillah-hf/bsb-llama-108 | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 43406
num_examples: 108
download_size: 26617
dataset_size: 43406
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shivam1021/Stack-Exchange-reduced | ---
license: openrail
---
|
tomfern/oxford-pets-subset | ---
license: cc-by-sa-4.0
task_categories:
- image-classification
language:
- en
tags:
- code
pretty_name: Oxford Pets Subset
size_categories:
- n<1K
--- |
typosonlr/MentalHealthPreProcessed | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4437356
num_examples: 3000
download_size: 1481045
dataset_size: 4437356
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HuggingFaceGECLM/REDDIT_comments | ---
dataset_info:
features:
- name: archived
dtype: string
- name: author
dtype: string
- name: author_fullname
dtype: string
- name: body
dtype: string
- name: comment_type
dtype: string
- name: controversiality
dtype: string
- name: created_utc
dtype: string
- name: edited
dtype: string
- name: gilded
dtype: string
- name: id
dtype: string
- name: link_id
dtype: string
- name: locked
dtype: string
- name: name
dtype: string
- name: parent_id
dtype: string
- name: permalink
dtype: string
- name: retrieved_on
dtype: string
- name: score
dtype: string
- name: subreddit_id
dtype: string
- name: subreddit_name_prefixed
dtype: string
- name: subreddit_type
dtype: string
- name: total_awards_received
dtype: string
splits:
- name: programming
num_bytes: 3466623746
num_examples: 7503347
- name: tifu
num_bytes: 4761338653
num_examples: 12738669
- name: explainlikeimfive
num_bytes: 8451732573
num_examples: 16392814
- name: WritingPrompts
num_bytes: 4651591771
num_examples: 4436210
- name: changemyview
num_bytes: 8603031915
num_examples: 11600073
- name: LifeProTips
num_bytes: 5272994396
num_examples: 12829459
- name: todayilearned
num_bytes: 22655655241
num_examples: 60199778
- name: science
num_bytes: 7069809765
num_examples: 18112884
- name: askscience
num_bytes: 3144754665
num_examples: 6286702
- name: ifyoulikeblank
num_bytes: 547200329
num_examples: 1332211
- name: Foodforthought
num_bytes: 308377128
num_examples: 567900
- name: IWantToLearn
num_bytes: 408331672
num_examples: 745543
- name: bestof
num_bytes: 2003718831
num_examples: 4347522
- name: IAmA
num_bytes: 9380094090
num_examples: 25778822
- name: socialskills
num_bytes: 1000014402
num_examples: 1842733
- name: relationship_advice
num_bytes: 22298879735
num_examples: 38937398
- name: philosophy
num_bytes: 1494947876
num_examples: 2391695
- name: YouShouldKnow
num_bytes: 1165617658
num_examples: 2639265
- name: history
num_bytes: 1457852402
num_examples: 2962043
- name: books
num_bytes: 4562689426
num_examples: 10187495
- name: Showerthoughts
num_bytes: 13259109532
num_examples: 34123213
- name: personalfinance
num_bytes: 9484869588
num_examples: 18361314
- name: buildapc
num_bytes: 9801044390
num_examples: 21761801
- name: EatCheapAndHealthy
num_bytes: 853462012
num_examples: 1821897
- name: boardgames
num_bytes: 3131627378
num_examples: 6328926
- name: malefashionadvice
num_bytes: 2928017882
num_examples: 7712258
- name: femalefashionadvice
num_bytes: 1619784736
num_examples: 3262969
- name: scifi
num_bytes: 888152056
num_examples: 2193741
- name: Fantasy
num_bytes: 2285934538
num_examples: 4566639
- name: Games
num_bytes: 10396813188
num_examples: 23373965
- name: bodyweightfitness
num_bytes: 794549854
num_examples: 1613634
- name: SkincareAddiction
num_bytes: 3421122597
num_examples: 5660550
- name: podcasts
num_bytes: 464773126
num_examples: 943266
- name: suggestmeabook
num_bytes: 1842944304
num_examples: 3492937
- name: AskHistorians
num_bytes: 2244587909
num_examples: 2714353
- name: gaming
num_bytes: 28374513722
num_examples: 85729253
- name: DIY
num_bytes: 2113533684
num_examples: 4489265
- name: sports
num_bytes: 2230129132
num_examples: 6470079
- name: space
num_bytes: 3081499208
num_examples: 7896182
- name: gadgets
num_bytes: 1683252868
num_examples: 4104833
- name: Documentaries
num_bytes: 1852644771
num_examples: 4051474
- name: GetMotivated
num_bytes: 1211761267
num_examples: 3221980
- name: UpliftingNews
num_bytes: 2003149025
num_examples: 4741948
- name: technology
num_bytes: 10826871436
num_examples: 25404699
- name: Fitness
num_bytes: 6191132755
num_examples: 14319856
- name: travel
num_bytes: 1740556350
num_examples: 3806755
- name: lifehacks
num_bytes: 626791812
num_examples: 1799437
- name: Damnthatsinteresting
num_bytes: 6376694618
num_examples: 15643554
- name: gardening
num_bytes: 1825313940
num_examples: 4568468
- name: mildlyinteresting
num_bytes: 9079894206
num_examples: 26436769
download_size: 109177016105
dataset_size: 255339788158
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: Reddit comments
size_categories:
- 10B<n<100B
source_datasets: []
tags:
- reddit
- social-media
task_categories:
- text-generation
task_ids:
- dialogue-modeling
- language-modeling
---
# Dataset Card for "REDDIT_comments"
## Dataset Description
- **Homepage:**
- **Paper:** https://arxiv.org/abs/2001.08435
### Dataset Summary
Comments from 50 high-quality subreddits, extracted from the Reddit PushShift data dumps (from 2006 to Jan 2023).
### Supported Tasks
These comments can be used for text generation and language modeling, as well as dialogue modeling.
## Dataset Structure
### Data Splits
Each split corresponds to a specific subreddit in the following list: "tifu", "explainlikeimfive", "WritingPrompts", "changemyview", "LifeProTips", "todayilearned", "science", "askscience", "ifyoulikeblank", "Foodforthought", "IWantToLearn", "bestof", "IAmA", "socialskills", "relationship_advice", "philosophy", "YouShouldKnow", "history", "books", "Showerthoughts", "personalfinance", "buildapc", "EatCheapAndHealthy", "boardgames", "malefashionadvice", "femalefashionadvice", "scifi", "Fantasy", "Games", "bodyweightfitness", "SkincareAddiction", "podcasts", "suggestmeabook", "AskHistorians", "gaming", "DIY", "mildlyinteresting", "sports", "space", "gadgets", "Documentaries", "GetMotivated", "UpliftingNews", "technology", "Fitness", "travel", "lifehacks", "Damnthatsinteresting", "gardening", "programming"
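As a hedged sketch (not part of the original card), a single subreddit split can be selected by name; the split name is validated locally here, and the actual `load_dataset` call, which needs the `datasets` library and network access, is left commented out:

```python
# Validate a subreddit name against the splits listed above (excerpt only),
# then optionally stream that split from the Hub.
SPLITS = {
    "programming", "tifu", "explainlikeimfive", "WritingPrompts",
    "changemyview", "askscience", "AskHistorians", "gaming",
}

def check_split(name: str) -> str:
    """Return the name unchanged if it is a known split, else raise ValueError."""
    if name not in SPLITS:
        raise ValueError(f"unknown split: {name!r}")
    return name

split = check_split("programming")
# from datasets import load_dataset
# ds = load_dataset("HuggingFaceGECLM/REDDIT_comments", split=split, streaming=True)
```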
## Dataset Creation
### Curation Rationale
All the information fields have been cast to string, as their formats change over time from one dump to the next. A reduced set of keys has been kept: "archived", "author", "author_fullname", "body", "comment_type", "controversiality", "created_utc", "edited", "gilded", "id", "link_id", "locked", "name", "parent_id", "permalink", "retrieved_on", "score", "subreddit", "subreddit_id", "subreddit_name_prefixed", "subreddit_type", "total_awards_received".
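Because every field is stored as a string, downstream code typically has to parse values back to their natural types. A minimal, defensive sketch (field names come from the list above; the `None` fallback and the boolean convention are assumptions, since the underlying formats vary between dumps):

```python
# Parse string-typed comment fields back to usable Python types.
# Missing or malformed values fall back to None rather than raising.
def parse_comment(raw: dict) -> dict:
    def to_int(value):
        try:
            return int(value)
        except (TypeError, ValueError):
            return None

    return {
        "body": raw.get("body"),
        "score": to_int(raw.get("score")),
        "created_utc": to_int(raw.get("created_utc")),
        "gilded": to_int(raw.get("gilded")),
        # Boolean encoding is an assumption; some dumps may use lowercase.
        "locked": raw.get("locked") in ("True", "true"),
    }

example = {"body": "hello", "score": "42", "created_utc": "1154771966",
           "gilded": "0", "locked": "False"}
```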
### Source Data
The [Reddit PushShift data dumps](https://files.pushshift.io/reddit/) are part of a data collection effort that crawls Reddit at regular intervals to extract and preserve all of its data.
#### Initial Data Collection and Normalization
See the paper.
#### Who are the source language producers?
Redditors are mostly young (65% below 30), male (70%), and American (50% of the site).
### Personal and Sensitive Information
The data contains Redditors' usernames associated with their content.
## Considerations for Using the Data
This dataset should be anonymized before any processing.
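One simple way to do this, sketched here with the standard library (the salt handling and the choice of fields are assumptions, not part of the card), is to replace usernames with a salted hash so comments by the same author remain linkable without exposing the name:

```python
import hashlib

# Replace author fields with a salted SHA-256 pseudonym. The salt keeps the
# mapping resistant to plain dictionary lookup; within one salt, the same
# author always maps to the same pseudonym.
def anonymize(comment: dict, salt: bytes = b"change-me") -> dict:
    out = dict(comment)
    for field in ("author", "author_fullname"):
        value = out.get(field)
        if value:
            digest = hashlib.sha256(salt + value.encode("utf-8")).hexdigest()
            out[field] = digest[:16]
    return out
```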
Though the selected subreddits are considered to be of higher quality, they can still reflect the expressions of bias and toxicity found elsewhere on the internet.
### Contributions
Thanks to [@clefourrier](https://github.com/clefourrier) for adding this dataset. |