datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
ZhangCNN/MindData_zh | ---
license: apache-2.0
---
|
Birchlabs/openai-prm800k-phase1_test-solutions-only | ---
license: mit
---
|
Fsoft-AIC/the-vault-inline | ---
language:
- code
- en
multilinguality:
- multiprogramming languages
task_categories:
- text-generation
license: mit
dataset_info:
features:
- name: identifier
dtype: string
- name: return_type
dtype: string
- name: repo
dtype: string
- name: path
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
dtype: string
- name: original_docstring
dtype: string
- name: comment
dtype: string
- name: docstring_tokens
dtype: string
- name: docstring
dtype: string
- name: original_string
dtype: string
pretty_name: The Vault Function
viewer: true
---
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Statistics](#dataset-statistics)
- [Usage](#usage)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [FSoft-AI4Code/TheVault](https://github.com/FSoft-AI4Code/TheVault)
- **Paper:** [The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation](https://arxiv.org/abs/2305.06156)
- **Contact:** support.ailab@fpt.com
- **Website:** https://www.fpt-aicenter.com/ai-residency/
<p align="center">
<img src="https://raw.githubusercontent.com/FSoft-AI4Code/TheVault/main/assets/the-vault-4-logo-png.png" width="300px" alt="logo">
</p>
<div align="center">
# The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation
</div>
## Dataset Summary
The Vault dataset is a comprehensive, large-scale, multilingual parallel dataset that features high-quality code-text pairs derived from The Stack, the largest permissively-licensed source code dataset.
The Vault contains code snippets from 10 popular programming languages: Java, JavaScript, Python, Ruby, Rust, Golang, C#, C++, C, and PHP. The dataset provides multiple code-snippet levels, metadata, and 11 docstring styles for enhanced usability and versatility.
## Supported Tasks
The Vault can be used for pretraining LLMs or for downstream code-text interaction tasks. A number of code understanding and generation tasks can be constructed from The Vault, such as *code summarization*, *text-to-code generation*, and *code search*.
## Languages
The natural language text (docstring) is in English.
10 programming languages are supported in The Vault: `Python`, `Java`, `JavaScript`, `PHP`, `C`, `C#`, `C++`, `Go`, `Ruby`, `Rust`
## Dataset Structure
### Data Instances
```
{
"hexsha": "ee1cf38808d3db0ea364b049509a01a65e6e5589",
"repo": "Waguy02/Boomer-Scripted",
"path": "python/subprojects/testbed/mlrl/testbed/persistence.py",
"license": [
"MIT"
],
"language": "Python",
"identifier": "__init__",
"code": "def __init__(self, model_dir: str):\n \"\"\"\n :param model_dir: The path of the directory where models should be saved\n \"\"\"\n self.model_dir = model_dir",
"code_tokens": [
"def",
"__init__",
"(",
"self",
",",
"model_dir",
":",
"str",
")",
":",
"\"\"\"\n :param model_dir: The path of the directory where models should be saved\n \"\"\"",
"self",
".",
"model_dir",
"=",
"model_dir"
],
"original_comment": "\"\"\"\n :param model_dir: The path of the directory where models should be saved\n \"\"\"",
"comment": ":param model_dir: The path of the directory where models should be saved",
"comment_tokens": [
":",
"param",
"model_dir",
":",
"The",
"path",
"of",
"the",
"directory",
"where",
"models",
"should",
"be",
"saved"
],
"start_point": [
1,
8
],
"end_point": [
3,
11
],
"prev_context": {
"code": null,
"start_point": null,
"end_point": null
},
"next_context": {
"code": "self.model_dir = model_dir",
"start_point": [
4,
8
],
"end_point": [
4,
34
]
}
}
```
### Data Fields
Data fields for inline level:
- **hexsha** (string): the unique git hash of the file
- **repo** (string): the repository name (`owner/repo`)
- **path** (string): the full path to the original file
- **license** (list): licenses of the repo
- **language** (string): the programming language
- **identifier** (string): the function or method name
- **code** (string): the code part of the original file
- **code_tokens** (list): tokenized version of `code`
- **original_comment** (string): the original text of the comment
- **comment** (string): the cleaned version of the comment
- **comment_tokens** (list): tokenized version of `comment`
- **start_point** (list): start position (`[row, column]`) of `original_comment` in `code`
- **end_point** (list): end position (`[row, column]`) of `original_comment` in `code`
- **prev_context** (dict): the block of code before `original_comment`
- **next_context** (dict): the block of code after `original_comment`
### Data Splits
In this repo, the inline-level data is not split; it is provided as a single `train` set.
## Dataset Statistics
| Languages | Number of inline comments |
|:-----------|---------------------------:|
|Python | 14,013,238 |
|Java | 17,062,277 |
|JavaScript | 1,438,110 |
|PHP | 5,873,744 |
|C | 6,778,239 |
|C# | 6,274,389 |
|C++ | 10,343,650 |
|Go | 4,390,342 |
|Ruby | 767,563 |
|Rust | 2,063,784 |
|TOTAL | **69,005,336** |
## Usage
You can load The Vault dataset using the `datasets` library (`pip install datasets`):
```python
from datasets import load_dataset

# Load the full inline-level dataset (69M samples)
dataset = load_dataset("Fsoft-AIC/the-vault-inline")

# Load a specific language (e.g. Python)
dataset = load_dataset("Fsoft-AIC/the-vault-inline", languages=['Python'])

# Stream the dataset
data = load_dataset("Fsoft-AIC/the-vault-inline", streaming=True)
for sample in iter(data['train']):
    print(sample)
```
## Additional Information
### Licensing Information
MIT License
### Citation Information
```
@article{manh2023vault,
title={The Vault: A Comprehensive Multilingual Dataset for Advancing Code Understanding and Generation},
author={Manh, Dung Nguyen and Hai, Nam Le and Dau, Anh TV and Nguyen, Anh Minh and Nghiem, Khanh and Guo, Jin and Bui, Nghi DQ},
journal={arXiv preprint arXiv:2305.06156},
year={2023}
}
```
### Contributions
This dataset is developed by [FSOFT AI4Code team](https://github.com/FSoft-AI4Code). |
DIBT/MPEP_FILIPINO | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for MPEP_FILIPINO
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("DIBT/MPEP_FILIPINO")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("DIBT/MPEP_FILIPINO")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| source | Source | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| target | Target | text | True | Translate the text. | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. Each suggestion is linked to an existing question and is named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are also optional: a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "1752",
"fields": {
"source": "what role does a pass transistor and an op amp play in a regulator"
},
"metadata": {
"evolved_from": null,
"kind": "human",
"source": "ewof/sharegpt-instruct-unfiltered-deduped"
},
"responses": [
{
"status": "submitted",
"user_id": "7a0edea3-6cb5-44c9-8ecf-fdf7935cf064",
"values": {
"target": {
"value": "ano ang papel na ginagampanan ng pass transistor at op amp sa isang regulator"
}
}
}
],
"suggestions": [
{
"agent": null,
"question_name": "target",
"score": null,
"type": null,
"value": "Ano ang papel na ginagampanan ng isang pass transistor at op amp sa isang regulator\n\nAng papel ng isang pass transistor sa isang regulator ay upang i-regulate ang output voltage at kung minsan ay ginagamit upang mabawasan ang load ng regulator. Nagbibigay ito ng malalaking daloy ng kuryente na kinakailangan upang mapanatili ang steady output kahit na ang input o ang load ay nagbabago.\n\nSa kabilang banda, ang isang operational amplifier o op amp ay isang uri ng amplifier na maaaring mag-adjust at mag-control ng voltage sa isang circuit. Ginagamit ito sa isang regulator upang mai-maintain ang tamang kahalumigmigan sa loob ng system habang sinisiguro na ang mga komponitente ay hindi mahalata ng sobrang init o lamig. Bukod dito, ginagamit din ito upang mabawasan ang distorsyon at nagsisilbi rin bilang isang voltage follower."
}
],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": "1752",
"metadata": "{\"evolved_from\": null, \"kind\": \"human\", \"source\": \"ewof/sharegpt-instruct-unfiltered-deduped\"}",
"source": "what role does a pass transistor and an op amp play in a regulator",
"target": [
{
"status": "submitted",
"user_id": "7a0edea3-6cb5-44c9-8ecf-fdf7935cf064",
"value": "ano ang papel na ginagampanan ng pass transistor at op amp sa isang regulator"
}
],
"target-suggestion": "Ano ang papel na ginagampanan ng isang pass transistor at op amp sa isang regulator\n\nAng papel ng isang pass transistor sa isang regulator ay upang i-regulate ang output voltage at kung minsan ay ginagamit upang mabawasan ang load ng regulator. Nagbibigay ito ng malalaking daloy ng kuryente na kinakailangan upang mapanatili ang steady output kahit na ang input o ang load ay nagbabago.\n\nSa kabilang banda, ang isang operational amplifier o op amp ay isang uri ng amplifier na maaaring mag-adjust at mag-control ng voltage sa isang circuit. Ginagamit ito sa isang regulator upang mai-maintain ang tamang kahalumigmigan sa loob ng system habang sinisiguro na ang mga komponitente ay hindi mahalata ng sobrang init o lamig. Bukod dito, ginagamit din ito upang mabawasan ang distorsyon at nagsisilbi rin bilang isang voltage follower.",
"target-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
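Note that in the flattened `datasets` representation, `metadata` arrives as a JSON string rather than a dictionary. A minimal sketch (field names taken from the record above) of recovering structured metadata and the submitted translations:

```python
import json

# Trimmed copy of the flattened record shown above.
record = {
    "external_id": "1752",
    "metadata": '{"evolved_from": null, "kind": "human", "source": "ewof/sharegpt-instruct-unfiltered-deduped"}',
    "target": [
        {
            "status": "submitted",
            "user_id": "7a0edea3-6cb5-44c9-8ecf-fdf7935cf064",
            "value": "ano ang papel na ginagampanan ng pass transistor at op amp sa isang regulator",
        }
    ],
}

# The metadata field is a serialized JSON string; parse it back into a dict.
metadata = json.loads(record["metadata"])

# Keep only responses that annotators actually submitted.
translations = [r["value"] for r in record["target"] if r["status"] == "submitted"]

print(metadata["kind"])   # -> human
print(len(translations))  # -> 1
```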
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **source** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **target** is of type `text`, with the description "Translate the text.".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **target-suggestion** is of type `text`.
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
This is a translation dataset that contains texts. Please translate the text in the text field.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gsstein/25-baseline-dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
splits:
- name: train
num_bytes: 86360344
num_examples: 15326
- name: test
num_bytes: 3066458
num_examples: 576
- name: validation
num_bytes: 3264255
num_examples: 576
download_size: 57405958
dataset_size: 92691057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
ibranze/araproje_arc_tr_s1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 46973
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_s1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeffdshen/neqa0_8shot | ---
license: cc-by-2.0
---
|
Argen7um/restrant-qa | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- legal
--- |
chenwangj/DexCap-Data | ---
license: cc-by-4.0
---
# Dataset Card for DexCap-Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://dex-cap.github.io/
- **Repository:** https://github.com/j96w/DexCap
- **Paper:** https://arxiv.org/abs/2403.07788
### Dataset Summary
This is the official dataset collected by the DexCap system to train dexterous robot manipulation using human hand motion capture data, as presented in the [paper](https://arxiv.org/abs/2403.07788). It contains 30 minutes of mocap data for the wiping task and 60 minutes of in-the-wild mocap data for the packaging task.
## Dataset Structure
Both raw data (`.zip`) and postprocessed data (`.hdf5`) are provided. The raw data is structured as follows:
```
save_data_scenario_1
├── frame_0
│ ├── color_image.jpg # Chest camera RGB image
│ ├── depth_image.png # Chest camera depth image
│ ├── pose.txt # Chest camera 6-DoF pose in world frame
│ ├── pose_2.txt # Left hand 6-DoF pose in world frame
│ ├── pose_3.txt # Right hand 6-DoF pose in world frame
│ ├── left_hand_joint.txt # Left hand joint positions (3D) in the palm frame
│ └── right_hand_joint.txt # Right hand joint positions (3D) in the palm frame
├── frame_1
└── ...
```
Details on how to postprocess the raw data into a training dataset (`.hdf5`) are introduced in the [GitHub repo](https://github.com/j96w/DexCap). The training scripts for policy learning with the HDF5 dataset are also included in the repo.
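As a minimal sketch (not part of the official DexCap tooling), the per-frame directory layout above can be walked like this; the parsing of the individual `pose*.txt` files is left to the GitHub repo, since their exact text format is not specified here:

```python
from pathlib import Path

def iter_frames(root):
    """Yield the per-frame file paths of a raw DexCap recording in frame order."""
    root = Path(root)
    # Directories are named frame_0, frame_1, ...; sort numerically, not lexically.
    frames = sorted(root.glob("frame_*"), key=lambda p: int(p.name.split("_")[1]))
    for frame in frames:
        yield {
            "rgb": frame / "color_image.jpg",         # chest camera RGB image
            "depth": frame / "depth_image.png",       # chest camera depth image
            "camera_pose": frame / "pose.txt",        # chest camera 6-DoF pose
            "left_hand_pose": frame / "pose_2.txt",   # left hand 6-DoF pose
            "right_hand_pose": frame / "pose_3.txt",  # right hand 6-DoF pose
            "left_hand_joints": frame / "left_hand_joint.txt",
            "right_hand_joints": frame / "right_hand_joint.txt",
        }
```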
## Dataset Creation
All data are collected by the [DexCap system](https://dex-cap.github.io/).
## Additional Information
### Licensing Information
This dataset is released under the [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/legalcode) license.
### Citation Information
```bibtex
@article{wang2024dexcap,
title = {DexCap: Scalable and Portable Mocap Data Collection System for Dexterous Manipulation},
author = {Wang, Chen and Shi, Haochen and Wang, Weizhuo and Zhang, Ruohan and Fei-Fei, Li and Liu, C. Karen},
journal = {arXiv preprint arXiv:2403.07788},
year = {2024}
}
``` |
tollefj/sts_altlex | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float32
splits:
- name: train
num_bytes: 30168028
num_examples: 112696
download_size: 20067804
dataset_size: 30168028
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SonicXtreme99/MikeySimonKappaMikey | ---
license: openrail
---
|
Emir292982/Mkmm | ---
license: openrail
---
|
JoAo352/modelokevin | ---
license: openrail
---
|
ricahrd/santo | ---
license: openrail
---
|
zouharvi/nmt-pe-effects | ---
license: cc
configs:
- config_name: default
data_files:
- split: phase_1
path: "phase_1.json"
- split: phase_2
path: "phase_2.json"
task_categories:
- translation
language:
- en
- cs
tags:
- post editing
- quality
size_categories:
- 1K<n<10K
---
# Neural Machine Translation Quality and Post-Editing Performance
This is a repository for an experiment relating NMT quality and post-editing effort, presented at EMNLP 2021 ([presentation recording](https://youtu.be/rCuoUbmJ5Uk)).
Please cite the following [paper](https://aclanthology.org/2021.emnlp-main.801/) when you use this research:
```
@inproceedings{zouhar2021neural,
title={Neural Machine Translation Quality and Post-Editing Performance},
author={Zouhar, Vil{\'e}m and Popel, Martin and Bojar, Ond{\v{r}}ej and Tamchyna, Ale{\v{s}}},
booktitle={Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing},
pages={10204--10214},
year={2021},
url={https://aclanthology.org/2021.emnlp-main.801/}
}
```
You can [access the data on huggingface](https://huggingface.co/datasets/zouharvi/nmt-pe-effects):
```python3
from datasets import load_dataset
# contains phase_1 and phase_2
data = load_dataset("zouharvi/nmt-pe-effects")
```
The first phase is the main one where we can see the effect of NMT quality on post-editing time.
The second phase is to estimate the quality of the first post-editing round.
The [code is also public](https://github.com/ufal/nmt-pe-effects-2021). |
taesiri/video-game-question-answering | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: string
- name: conversations
dtype: string
- name: model
dtype: string
splits:
- name: train
num_bytes: 27560807
num_examples: 36705
download_size: 9281942
dataset_size: 27560807
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxu9001/custom_ontonotes5 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence: int32
splits:
- name: train
num_bytes: 3773643
num_examples: 12195
- name: validation
num_bytes: 480047
num_examples: 1553
- name: test
num_bytes: 481250
num_examples: 1573
download_size: 0
dataset_size: 4734940
---
# Dataset Card for "custom_ontonotes5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AiresPucrs/MNIST-digit | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype: int64
- name: 1x1
dtype: int64
- name: 1x2
dtype: int64
- name: 1x3
dtype: int64
- name: 1x4
dtype: int64
- name: 1x5
dtype: int64
- name: 1x6
dtype: int64
- name: 1x7
dtype: int64
- name: 1x8
dtype: int64
- name: 1x9
dtype: int64
- name: 1x10
dtype: int64
- name: 1x11
dtype: int64
- name: 1x12
dtype: int64
- name: 1x13
dtype: int64
- name: 1x14
dtype: int64
- name: 1x15
dtype: int64
- name: 1x16
dtype: int64
- name: 1x17
dtype: int64
- name: 1x18
dtype: int64
- name: 1x19
dtype: int64
- name: 1x20
dtype: int64
- name: 1x21
dtype: int64
- name: 1x22
dtype: int64
- name: 1x23
dtype: int64
- name: 1x24
dtype: int64
- name: 1x25
dtype: int64
- name: 1x26
dtype: int64
- name: 1x27
dtype: int64
- name: 1x28
dtype: int64
- name: 2x1
dtype: int64
- name: 2x2
dtype: int64
- name: 2x3
dtype: int64
- name: 2x4
dtype: int64
- name: 2x5
dtype: int64
- name: 2x6
dtype: int64
- name: 2x7
dtype: int64
- name: 2x8
dtype: int64
- name: 2x9
dtype: int64
- name: 2x10
dtype: int64
- name: 2x11
dtype: int64
- name: 2x12
dtype: int64
- name: 2x13
dtype: int64
- name: 2x14
dtype: int64
- name: 2x15
dtype: int64
- name: 2x16
dtype: int64
- name: 2x17
dtype: int64
- name: 2x18
dtype: int64
- name: 2x19
dtype: int64
- name: 2x20
dtype: int64
- name: 2x21
dtype: int64
- name: 2x22
dtype: int64
- name: 2x23
dtype: int64
- name: 2x24
dtype: int64
- name: 2x25
dtype: int64
- name: 2x26
dtype: int64
- name: 2x27
dtype: int64
- name: 2x28
dtype: int64
- name: 3x1
dtype: int64
- name: 3x2
dtype: int64
- name: 3x3
dtype: int64
- name: 3x4
dtype: int64
- name: 3x5
dtype: int64
- name: 3x6
dtype: int64
- name: 3x7
dtype: int64
- name: 3x8
dtype: int64
- name: 3x9
dtype: int64
- name: 3x10
dtype: int64
- name: 3x11
dtype: int64
- name: 3x12
dtype: int64
- name: 3x13
dtype: int64
- name: 3x14
dtype: int64
- name: 3x15
dtype: int64
- name: 3x16
dtype: int64
- name: 3x17
dtype: int64
- name: 3x18
dtype: int64
- name: 3x19
dtype: int64
- name: 3x20
dtype: int64
- name: 3x21
dtype: int64
- name: 3x22
dtype: int64
- name: 3x23
dtype: int64
- name: 3x24
dtype: int64
- name: 3x25
dtype: int64
- name: 3x26
dtype: int64
- name: 3x27
dtype: int64
- name: 3x28
dtype: int64
- name: 4x1
dtype: int64
- name: 4x2
dtype: int64
- name: 4x3
dtype: int64
- name: 4x4
dtype: int64
- name: 4x5
dtype: int64
- name: 4x6
dtype: int64
- name: 4x7
dtype: int64
- name: 4x8
dtype: int64
- name: 4x9
dtype: int64
- name: 4x10
dtype: int64
- name: 4x11
dtype: int64
- name: 4x12
dtype: int64
- name: 4x13
dtype: int64
- name: 4x14
dtype: int64
- name: 4x15
dtype: int64
- name: 4x16
dtype: int64
- name: 4x17
dtype: int64
- name: 4x18
dtype: int64
- name: 4x19
dtype: int64
- name: 4x20
dtype: int64
- name: 4x21
dtype: int64
- name: 4x22
dtype: int64
- name: 4x23
dtype: int64
- name: 4x24
dtype: int64
- name: 4x25
dtype: int64
- name: 4x26
dtype: int64
- name: 4x27
dtype: int64
- name: 4x28
dtype: int64
- name: 5x1
dtype: int64
- name: 5x2
dtype: int64
- name: 5x3
dtype: int64
- name: 5x4
dtype: int64
- name: 5x5
dtype: int64
- name: 5x6
dtype: int64
- name: 5x7
dtype: int64
- name: 5x8
dtype: int64
- name: 5x9
dtype: int64
- name: 5x10
dtype: int64
- name: 5x11
dtype: int64
- name: 5x12
dtype: int64
- name: 5x13
dtype: int64
- name: 5x14
dtype: int64
- name: 5x15
dtype: int64
- name: 5x16
dtype: int64
- name: 5x17
dtype: int64
- name: 5x18
dtype: int64
- name: 5x19
dtype: int64
- name: 5x20
dtype: int64
- name: 5x21
dtype: int64
- name: 5x22
dtype: int64
- name: 5x23
dtype: int64
- name: 5x24
dtype: int64
- name: 5x25
dtype: int64
- name: 5x26
dtype: int64
- name: 5x27
dtype: int64
- name: 5x28
dtype: int64
- name: 6x1
dtype: int64
- name: 6x2
dtype: int64
- name: 6x3
dtype: int64
- name: 6x4
dtype: int64
- name: 6x5
dtype: int64
- name: 6x6
dtype: int64
- name: 6x7
dtype: int64
- name: 6x8
dtype: int64
- name: 6x9
dtype: int64
- name: 6x10
dtype: int64
- name: 6x11
dtype: int64
- name: 6x12
dtype: int64
- name: 6x13
dtype: int64
- name: 6x14
dtype: int64
- name: 6x15
dtype: int64
- name: 6x16
dtype: int64
- name: 6x17
dtype: int64
- name: 6x18
dtype: int64
- name: 6x19
dtype: int64
- name: 6x20
dtype: int64
- name: 6x21
dtype: int64
- name: 6x22
dtype: int64
- name: 6x23
dtype: int64
- name: 6x24
dtype: int64
- name: 6x25
dtype: int64
- name: 6x26
dtype: int64
- name: 6x27
dtype: int64
- name: 6x28
dtype: int64
- name: 7x1
dtype: int64
- name: 7x2
dtype: int64
- name: 7x3
dtype: int64
- name: 7x4
dtype: int64
- name: 7x5
dtype: int64
- name: 7x6
dtype: int64
- name: 7x7
dtype: int64
- name: 7x8
dtype: int64
- name: 7x9
dtype: int64
- name: 7x10
dtype: int64
- name: 7x11
dtype: int64
- name: 7x12
dtype: int64
- name: 7x13
dtype: int64
- name: 7x14
dtype: int64
- name: 7x15
dtype: int64
- name: 7x16
dtype: int64
- name: 7x17
dtype: int64
- name: 7x18
dtype: int64
- name: 7x19
dtype: int64
- name: 7x20
dtype: int64
- name: 7x21
dtype: int64
- name: 7x22
dtype: int64
- name: 7x23
dtype: int64
- name: 7x24
dtype: int64
- name: 7x25
dtype: int64
- name: 7x26
dtype: int64
- name: 7x27
dtype: int64
- name: 7x28
dtype: int64
- name: 8x1
dtype: int64
- name: 8x2
dtype: int64
- name: 8x3
dtype: int64
- name: 8x4
dtype: int64
- name: 8x5
dtype: int64
- name: 8x6
dtype: int64
- name: 8x7
dtype: int64
- name: 8x8
dtype: int64
- name: 8x9
dtype: int64
- name: 8x10
dtype: int64
- name: 8x11
dtype: int64
- name: 8x12
dtype: int64
- name: 8x13
dtype: int64
- name: 8x14
dtype: int64
- name: 8x15
dtype: int64
- name: 8x16
dtype: int64
- name: 8x17
dtype: int64
- name: 8x18
dtype: int64
- name: 8x19
dtype: int64
- name: 8x20
dtype: int64
- name: 8x21
dtype: int64
- name: 8x22
dtype: int64
- name: 8x23
dtype: int64
- name: 8x24
dtype: int64
- name: 8x25
dtype: int64
- name: 8x26
dtype: int64
- name: 8x27
dtype: int64
- name: 8x28
dtype: int64
- name: 9x1
dtype: int64
- name: 9x2
dtype: int64
- name: 9x3
dtype: int64
- name: 9x4
dtype: int64
- name: 9x5
dtype: int64
- name: 9x6
dtype: int64
- name: 9x7
dtype: int64
- name: 9x8
dtype: int64
- name: 9x9
dtype: int64
- name: 9x10
dtype: int64
- name: 9x11
dtype: int64
- name: 9x12
dtype: int64
- name: 9x13
dtype: int64
- name: 9x14
dtype: int64
- name: 9x15
dtype: int64
- name: 9x16
dtype: int64
- name: 9x17
dtype: int64
- name: 9x18
dtype: int64
- name: 9x19
dtype: int64
- name: 9x20
dtype: int64
- name: 9x21
dtype: int64
- name: 9x22
dtype: int64
- name: 9x23
dtype: int64
- name: 9x24
dtype: int64
- name: 9x25
dtype: int64
- name: 9x26
dtype: int64
- name: 9x27
dtype: int64
- name: 9x28
dtype: int64
- name: 10x1
dtype: int64
- name: 10x2
dtype: int64
- name: 10x3
dtype: int64
- name: 10x4
dtype: int64
- name: 10x5
dtype: int64
- name: 10x6
dtype: int64
- name: 10x7
dtype: int64
- name: 10x8
dtype: int64
- name: 10x9
dtype: int64
- name: 10x10
dtype: int64
- name: 10x11
dtype: int64
- name: 10x12
dtype: int64
- name: 10x13
dtype: int64
- name: 10x14
dtype: int64
- name: 10x15
dtype: int64
- name: 10x16
dtype: int64
- name: 10x17
dtype: int64
- name: 10x18
dtype: int64
- name: 10x19
dtype: int64
- name: 10x20
dtype: int64
- name: 10x21
dtype: int64
- name: 10x22
dtype: int64
- name: 10x23
dtype: int64
- name: 10x24
dtype: int64
- name: 10x25
dtype: int64
- name: 10x26
dtype: int64
- name: 10x27
dtype: int64
- name: 10x28
dtype: int64
- name: 11x1
dtype: int64
- name: 11x2
dtype: int64
- name: 11x3
dtype: int64
- name: 11x4
dtype: int64
- name: 11x5
dtype: int64
- name: 11x6
dtype: int64
- name: 11x7
dtype: int64
- name: 11x8
dtype: int64
- name: 11x9
dtype: int64
- name: 11x10
dtype: int64
- name: 11x11
dtype: int64
- name: 11x12
dtype: int64
- name: 11x13
dtype: int64
- name: 11x14
dtype: int64
- name: 11x15
dtype: int64
- name: 11x16
dtype: int64
- name: 11x17
dtype: int64
- name: 11x18
dtype: int64
- name: 11x19
dtype: int64
- name: 11x20
dtype: int64
- name: 11x21
dtype: int64
- name: 11x22
dtype: int64
- name: 11x23
dtype: int64
- name: 11x24
dtype: int64
- name: 11x25
dtype: int64
- name: 11x26
dtype: int64
- name: 11x27
dtype: int64
- name: 11x28
dtype: int64
- name: 12x1
dtype: int64
- name: 12x2
dtype: int64
- name: 12x3
dtype: int64
- name: 12x4
dtype: int64
- name: 12x5
dtype: int64
- name: 12x6
dtype: int64
- name: 12x7
dtype: int64
- name: 12x8
dtype: int64
- name: 12x9
dtype: int64
- name: 12x10
dtype: int64
- name: 12x11
dtype: int64
- name: 12x12
dtype: int64
- name: 12x13
dtype: int64
- name: 12x14
dtype: int64
- name: 12x15
dtype: int64
- name: 12x16
dtype: int64
- name: 12x17
dtype: int64
- name: 12x18
dtype: int64
- name: 12x19
dtype: int64
- name: 12x20
dtype: int64
- name: 12x21
dtype: int64
- name: 12x22
dtype: int64
- name: 12x23
dtype: int64
- name: 12x24
dtype: int64
- name: 12x25
dtype: int64
- name: 12x26
dtype: int64
- name: 12x27
dtype: int64
- name: 12x28
dtype: int64
- name: 13x1
dtype: int64
- name: 13x2
dtype: int64
- name: 13x3
dtype: int64
- name: 13x4
dtype: int64
- name: 13x5
dtype: int64
- name: 13x6
dtype: int64
- name: 13x7
dtype: int64
- name: 13x8
dtype: int64
- name: 13x9
dtype: int64
- name: 13x10
dtype: int64
- name: 13x11
dtype: int64
- name: 13x12
dtype: int64
- name: 13x13
dtype: int64
- name: 13x14
dtype: int64
- name: 13x15
dtype: int64
- name: 13x16
dtype: int64
- name: 13x17
dtype: int64
- name: 13x18
dtype: int64
- name: 13x19
dtype: int64
- name: 13x20
dtype: int64
- name: 13x21
dtype: int64
- name: 13x22
dtype: int64
- name: 13x23
dtype: int64
- name: 13x24
dtype: int64
- name: 13x25
dtype: int64
- name: 13x26
dtype: int64
- name: 13x27
dtype: int64
- name: 13x28
dtype: int64
- name: 14x1
dtype: int64
- name: 14x2
dtype: int64
- name: 14x3
dtype: int64
- name: 14x4
dtype: int64
- name: 14x5
dtype: int64
- name: 14x6
dtype: int64
- name: 14x7
dtype: int64
- name: 14x8
dtype: int64
- name: 14x9
dtype: int64
- name: 14x10
dtype: int64
- name: 14x11
dtype: int64
- name: 14x12
dtype: int64
- name: 14x13
dtype: int64
- name: 14x14
dtype: int64
- name: 14x15
dtype: int64
- name: 14x16
dtype: int64
- name: 14x17
dtype: int64
- name: 14x18
dtype: int64
- name: 14x19
dtype: int64
- name: 14x20
dtype: int64
- name: 14x21
dtype: int64
- name: 14x22
dtype: int64
- name: 14x23
dtype: int64
- name: 14x24
dtype: int64
- name: 14x25
dtype: int64
- name: 14x26
dtype: int64
- name: 14x27
dtype: int64
- name: 14x28
dtype: int64
- name: 15x1
dtype: int64
- name: 15x2
dtype: int64
- name: 15x3
dtype: int64
- name: 15x4
dtype: int64
- name: 15x5
dtype: int64
- name: 15x6
dtype: int64
- name: 15x7
dtype: int64
- name: 15x8
dtype: int64
- name: 15x9
dtype: int64
- name: 15x10
dtype: int64
- name: 15x11
dtype: int64
- name: 15x12
dtype: int64
- name: 15x13
dtype: int64
- name: 15x14
dtype: int64
- name: 15x15
dtype: int64
- name: 15x16
dtype: int64
- name: 15x17
dtype: int64
- name: 15x18
dtype: int64
- name: 15x19
dtype: int64
- name: 15x20
dtype: int64
- name: 15x21
dtype: int64
- name: 15x22
dtype: int64
- name: 15x23
dtype: int64
- name: 15x24
dtype: int64
- name: 15x25
dtype: int64
- name: 15x26
dtype: int64
- name: 15x27
dtype: int64
- name: 15x28
dtype: int64
- name: 16x1
dtype: int64
- name: 16x2
dtype: int64
- name: 16x3
dtype: int64
- name: 16x4
dtype: int64
- name: 16x5
dtype: int64
- name: 16x6
dtype: int64
- name: 16x7
dtype: int64
- name: 16x8
dtype: int64
- name: 16x9
dtype: int64
- name: 16x10
dtype: int64
- name: 16x11
dtype: int64
- name: 16x12
dtype: int64
- name: 16x13
dtype: int64
- name: 16x14
dtype: int64
- name: 16x15
dtype: int64
- name: 16x16
dtype: int64
- name: 16x17
dtype: int64
- name: 16x18
dtype: int64
- name: 16x19
dtype: int64
- name: 16x20
dtype: int64
- name: 16x21
dtype: int64
- name: 16x22
dtype: int64
- name: 16x23
dtype: int64
- name: 16x24
dtype: int64
- name: 16x25
dtype: int64
- name: 16x26
dtype: int64
- name: 16x27
dtype: int64
- name: 16x28
dtype: int64
- name: 17x1
dtype: int64
- name: 17x2
dtype: int64
- name: 17x3
dtype: int64
- name: 17x4
dtype: int64
- name: 17x5
dtype: int64
- name: 17x6
dtype: int64
- name: 17x7
dtype: int64
- name: 17x8
dtype: int64
- name: 17x9
dtype: int64
- name: 17x10
dtype: int64
- name: 17x11
dtype: int64
- name: 17x12
dtype: int64
- name: 17x13
dtype: int64
- name: 17x14
dtype: int64
- name: 17x15
dtype: int64
- name: 17x16
dtype: int64
- name: 17x17
dtype: int64
- name: 17x18
dtype: int64
- name: 17x19
dtype: int64
- name: 17x20
dtype: int64
- name: 17x21
dtype: int64
- name: 17x22
dtype: int64
- name: 17x23
dtype: int64
- name: 17x24
dtype: int64
- name: 17x25
dtype: int64
- name: 17x26
dtype: int64
- name: 17x27
dtype: int64
- name: 17x28
dtype: int64
- name: 18x1
dtype: int64
- name: 18x2
dtype: int64
- name: 18x3
dtype: int64
- name: 18x4
dtype: int64
- name: 18x5
dtype: int64
- name: 18x6
dtype: int64
- name: 18x7
dtype: int64
- name: 18x8
dtype: int64
- name: 18x9
dtype: int64
- name: 18x10
dtype: int64
- name: 18x11
dtype: int64
- name: 18x12
dtype: int64
- name: 18x13
dtype: int64
- name: 18x14
dtype: int64
- name: 18x15
dtype: int64
- name: 18x16
dtype: int64
- name: 18x17
dtype: int64
- name: 18x18
dtype: int64
- name: 18x19
dtype: int64
- name: 18x20
dtype: int64
- name: 18x21
dtype: int64
- name: 18x22
dtype: int64
- name: 18x23
dtype: int64
- name: 18x24
dtype: int64
- name: 18x25
dtype: int64
- name: 18x26
dtype: int64
- name: 18x27
dtype: int64
- name: 18x28
dtype: int64
- name: 19x1
dtype: int64
- name: 19x2
dtype: int64
- name: 19x3
dtype: int64
- name: 19x4
dtype: int64
- name: 19x5
dtype: int64
- name: 19x6
dtype: int64
- name: 19x7
dtype: int64
- name: 19x8
dtype: int64
- name: 19x9
dtype: int64
- name: 19x10
dtype: int64
- name: 19x11
dtype: int64
- name: 19x12
dtype: int64
- name: 19x13
dtype: int64
- name: 19x14
dtype: int64
- name: 19x15
dtype: int64
- name: 19x16
dtype: int64
- name: 19x17
dtype: int64
- name: 19x18
dtype: int64
- name: 19x19
dtype: int64
- name: 19x20
dtype: int64
- name: 19x21
dtype: int64
- name: 19x22
dtype: int64
- name: 19x23
dtype: int64
- name: 19x24
dtype: int64
- name: 19x25
dtype: int64
- name: 19x26
dtype: int64
- name: 19x27
dtype: int64
- name: 19x28
dtype: int64
- name: 20x1
dtype: int64
- name: 20x2
dtype: int64
- name: 20x3
dtype: int64
- name: 20x4
dtype: int64
- name: 20x5
dtype: int64
- name: 20x6
dtype: int64
- name: 20x7
dtype: int64
- name: 20x8
dtype: int64
- name: 20x9
dtype: int64
- name: 20x10
dtype: int64
- name: 20x11
dtype: int64
- name: 20x12
dtype: int64
- name: 20x13
dtype: int64
- name: 20x14
dtype: int64
- name: 20x15
dtype: int64
- name: 20x16
dtype: int64
- name: 20x17
dtype: int64
- name: 20x18
dtype: int64
- name: 20x19
dtype: int64
- name: 20x20
dtype: int64
- name: 20x21
dtype: int64
- name: 20x22
dtype: int64
- name: 20x23
dtype: int64
- name: 20x24
dtype: int64
- name: 20x25
dtype: int64
- name: 20x26
dtype: int64
- name: 20x27
dtype: int64
- name: 20x28
dtype: int64
- name: 21x1
dtype: int64
- name: 21x2
dtype: int64
- name: 21x3
dtype: int64
- name: 21x4
dtype: int64
- name: 21x5
dtype: int64
- name: 21x6
dtype: int64
- name: 21x7
dtype: int64
- name: 21x8
dtype: int64
- name: 21x9
dtype: int64
- name: 21x10
dtype: int64
- name: 21x11
dtype: int64
- name: 21x12
dtype: int64
- name: 21x13
dtype: int64
- name: 21x14
dtype: int64
- name: 21x15
dtype: int64
- name: 21x16
dtype: int64
- name: 21x17
dtype: int64
- name: 21x18
dtype: int64
- name: 21x19
dtype: int64
- name: 21x20
dtype: int64
- name: 21x21
dtype: int64
- name: 21x22
dtype: int64
- name: 21x23
dtype: int64
- name: 21x24
dtype: int64
- name: 21x25
dtype: int64
- name: 21x26
dtype: int64
- name: 21x27
dtype: int64
- name: 21x28
dtype: int64
- name: 22x1
dtype: int64
- name: 22x2
dtype: int64
- name: 22x3
dtype: int64
- name: 22x4
dtype: int64
- name: 22x5
dtype: int64
- name: 22x6
dtype: int64
- name: 22x7
dtype: int64
- name: 22x8
dtype: int64
- name: 22x9
dtype: int64
- name: 22x10
dtype: int64
- name: 22x11
dtype: int64
- name: 22x12
dtype: int64
- name: 22x13
dtype: int64
- name: 22x14
dtype: int64
- name: 22x15
dtype: int64
- name: 22x16
dtype: int64
- name: 22x17
dtype: int64
- name: 22x18
dtype: int64
- name: 22x19
dtype: int64
- name: 22x20
dtype: int64
- name: 22x21
dtype: int64
- name: 22x22
dtype: int64
- name: 22x23
dtype: int64
- name: 22x24
dtype: int64
- name: 22x25
dtype: int64
- name: 22x26
dtype: int64
- name: 22x27
dtype: int64
- name: 22x28
dtype: int64
- name: 23x1
dtype: int64
- name: 23x2
dtype: int64
- name: 23x3
dtype: int64
- name: 23x4
dtype: int64
- name: 23x5
dtype: int64
- name: 23x6
dtype: int64
- name: 23x7
dtype: int64
- name: 23x8
dtype: int64
- name: 23x9
dtype: int64
- name: 23x10
dtype: int64
- name: 23x11
dtype: int64
- name: 23x12
dtype: int64
- name: 23x13
dtype: int64
- name: 23x14
dtype: int64
- name: 23x15
dtype: int64
- name: 23x16
dtype: int64
- name: 23x17
dtype: int64
- name: 23x18
dtype: int64
- name: 23x19
dtype: int64
- name: 23x20
dtype: int64
- name: 23x21
dtype: int64
- name: 23x22
dtype: int64
- name: 23x23
dtype: int64
- name: 23x24
dtype: int64
- name: 23x25
dtype: int64
- name: 23x26
dtype: int64
- name: 23x27
dtype: int64
- name: 23x28
dtype: int64
- name: 24x1
dtype: int64
- name: 24x2
dtype: int64
- name: 24x3
dtype: int64
- name: 24x4
dtype: int64
- name: 24x5
dtype: int64
- name: 24x6
dtype: int64
- name: 24x7
dtype: int64
- name: 24x8
dtype: int64
- name: 24x9
dtype: int64
- name: 24x10
dtype: int64
- name: 24x11
dtype: int64
- name: 24x12
dtype: int64
- name: 24x13
dtype: int64
- name: 24x14
dtype: int64
- name: 24x15
dtype: int64
- name: 24x16
dtype: int64
- name: 24x17
dtype: int64
- name: 24x18
dtype: int64
- name: 24x19
dtype: int64
- name: 24x20
dtype: int64
- name: 24x21
dtype: int64
- name: 24x22
dtype: int64
- name: 24x23
dtype: int64
- name: 24x24
dtype: int64
- name: 24x25
dtype: int64
- name: 24x26
dtype: int64
- name: 24x27
dtype: int64
- name: 24x28
dtype: int64
- name: 25x1
dtype: int64
- name: 25x2
dtype: int64
- name: 25x3
dtype: int64
- name: 25x4
dtype: int64
- name: 25x5
dtype: int64
- name: 25x6
dtype: int64
- name: 25x7
dtype: int64
- name: 25x8
dtype: int64
- name: 25x9
dtype: int64
- name: 25x10
dtype: int64
- name: 25x11
dtype: int64
- name: 25x12
dtype: int64
- name: 25x13
dtype: int64
- name: 25x14
dtype: int64
- name: 25x15
dtype: int64
- name: 25x16
dtype: int64
- name: 25x17
dtype: int64
- name: 25x18
dtype: int64
- name: 25x19
dtype: int64
- name: 25x20
dtype: int64
- name: 25x21
dtype: int64
- name: 25x22
dtype: int64
- name: 25x23
dtype: int64
- name: 25x24
dtype: int64
- name: 25x25
dtype: int64
- name: 25x26
dtype: int64
- name: 25x27
dtype: int64
- name: 25x28
dtype: int64
- name: 26x1
dtype: int64
- name: 26x2
dtype: int64
- name: 26x3
dtype: int64
- name: 26x4
dtype: int64
- name: 26x5
dtype: int64
- name: 26x6
dtype: int64
- name: 26x7
dtype: int64
- name: 26x8
dtype: int64
- name: 26x9
dtype: int64
- name: 26x10
dtype: int64
- name: 26x11
dtype: int64
- name: 26x12
dtype: int64
- name: 26x13
dtype: int64
- name: 26x14
dtype: int64
- name: 26x15
dtype: int64
- name: 26x16
dtype: int64
- name: 26x17
dtype: int64
- name: 26x18
dtype: int64
- name: 26x19
dtype: int64
- name: 26x20
dtype: int64
- name: 26x21
dtype: int64
- name: 26x22
dtype: int64
- name: 26x23
dtype: int64
- name: 26x24
dtype: int64
- name: 26x25
dtype: int64
- name: 26x26
dtype: int64
- name: 26x27
dtype: int64
- name: 26x28
dtype: int64
- name: 27x1
dtype: int64
- name: 27x2
dtype: int64
- name: 27x3
dtype: int64
- name: 27x4
dtype: int64
- name: 27x5
dtype: int64
- name: 27x6
dtype: int64
- name: 27x7
dtype: int64
- name: 27x8
dtype: int64
- name: 27x9
dtype: int64
- name: 27x10
dtype: int64
- name: 27x11
dtype: int64
- name: 27x12
dtype: int64
- name: 27x13
dtype: int64
- name: 27x14
dtype: int64
- name: 27x15
dtype: int64
- name: 27x16
dtype: int64
- name: 27x17
dtype: int64
- name: 27x18
dtype: int64
- name: 27x19
dtype: int64
- name: 27x20
dtype: int64
- name: 27x21
dtype: int64
- name: 27x22
dtype: int64
- name: 27x23
dtype: int64
- name: 27x24
dtype: int64
- name: 27x25
dtype: int64
- name: 27x26
dtype: int64
- name: 27x27
dtype: int64
- name: 27x28
dtype: int64
- name: 28x1
dtype: int64
- name: 28x2
dtype: int64
- name: 28x3
dtype: int64
- name: 28x4
dtype: int64
- name: 28x5
dtype: int64
- name: 28x6
dtype: int64
- name: 28x7
dtype: int64
- name: 28x8
dtype: int64
- name: 28x9
dtype: int64
- name: 28x10
dtype: int64
- name: 28x11
dtype: int64
- name: 28x12
dtype: int64
- name: 28x13
dtype: int64
- name: 28x14
dtype: int64
- name: 28x15
dtype: int64
- name: 28x16
dtype: int64
- name: 28x17
dtype: int64
- name: 28x18
dtype: int64
- name: 28x19
dtype: int64
- name: 28x20
dtype: int64
- name: 28x21
dtype: int64
- name: 28x22
dtype: int64
- name: 28x23
dtype: int64
- name: 28x24
dtype: int64
- name: 28x25
dtype: int64
- name: 28x26
dtype: int64
- name: 28x27
dtype: int64
- name: 28x28
dtype: int64
splits:
- name: train
num_bytes: 376800000
num_examples: 60000
- name: test
num_bytes: 62800000
num_examples: 10000
download_size: 55424916
dataset_size: 439600000
pretty_name: MNIST-digit
size_categories:
- 10K<n<100K
---
# MNIST-digit
## Overview
The MNIST dataset is a widely used benchmark in machine learning and computer vision.
Each example is a grayscale image of a handwritten digit (0-9).
It is used for tasks like digit recognition, where the goal is to correctly identify the digit (0-9) shown in the image.
## Dataset Details
The [MNIST](https://www.tensorflow.org/datasets/catalog/mnist) (Modified National Institute of Standards and Technology) dataset
was introduced in a paper by Yann LeCun, Corinna Cortes, and CJ Burges in 2010:
```latex
@article{lecun2010mnist,
title={MNIST handwritten digit database},
author={LeCun, Yann and Cortes, Corinna and Burges, CJ},
journal={ATT Labs [Online]. Available: http://yann.lecun.com/exdb/mnist},
volume={2},
year={2010}
}
```
## Contents
The dataset consists of 60,000 training images and 10,000 testing images. Each image is a grayscale representation of a
handwritten digit, with a resolution of 28x28 pixels.
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/MNIST-digit", split='train')
```
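As the feature list above shows, each row stores one image as 784 integer pixel columns named `"1x1"` through `"28x28"` (row-by-column). A minimal sketch of rebuilding the 28x28 grid from such a row (the `to_image` helper is illustrative, not part of the dataset):

```python
def to_image(row):
    """Rebuild a 28x28 nested list of pixel values from the
    flattened columns "1x1" .. "28x28" of one dataset row."""
    return [[row[f"{r}x{c}"] for c in range(1, 29)] for r in range(1, 29)]


# Usage with the dataset itself (requires network access):
# from datasets import load_dataset
# dataset = load_dataset("AiresPucrs/MNIST-digit", split="train")
# img = to_image(dataset[0])   # 28 rows of 28 pixel values each
```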
## License
The dataset is licensed under "Data files © Original Authors".
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_138 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1136548116.0
num_examples: 221463
download_size: 1163957331
dataset_size: 1136548116.0
---
# Dataset Card for "chunk_138"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eerichmondspam/sourceformer-data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9741728
num_examples: 1577
download_size: 5604518
dataset_size: 9741728
---
# Dataset Card for "sourceformer-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pere/sami_parallel | ---
license: apache-2.0
---
|
ShynBui/vi_term_definition | ---
task_categories:
- text-classification
- zero-shot-classification
language:
- vi
tags:
- legal
pretty_name: Dataset for vietnamese term definition classification
size_categories:
- 10K<n<100K
--- |
james-burton/OrientalMuseum_min5-white-mat | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: label
dtype:
class_label:
names:
'0': Animal Mummy
'1': Batik
'2': Buffalo Horn
'3': Chinese Red Rosewood
'4': Colour on Paper
'5': Flint/Chert
'6': Gouache on Paper
'7': Haematite/Red Ochre
'8': Human Bone
'9': Ink and Colour on Paper
'10': Ink and Colours on Silk
'11': Ink and Opaque Watercolour on Paper
'12': Ink on Paper
'13': Jade (Calcified)
'14': Japanese paper
'15': Microcline/Green Feldspar/Amazon-Stone
'16': Nile Mud
'17': Opaque Watercolour on Paper
'18': Opaque Watercolour or Gouache on Mica
'19': Pith
'20': Pith Paper
'21': Plant Product
'22': Resin/Plastic
'23': Rhinoceros Horn
'24': Smaragdite
'25': Steatite
'26': Steatite/Soap Stone
'27': Watercolour on Rice Paper
'28': acrylic
'29': agate
'30': alabaster
'31': aluminum
'32': amber
'33': amethyst
'34': antler
'35': artificial stone
'36': balsa
'37': bamboo
'38': basalt
'39': bone
'40': bowenite
'41': boxwood
'42': brass
'43': brocade
'44': bronze
'45': burnt jade
'46': canvas
'47': cardboard
'48': cards
'49': carnelian
'50': cast iron
'51': celadon
'52': cellulose acetate
'53': ceramic
'54': chalcedony
'55': cherry
'56': clay
'57': cloth
'58': coconut
'59': copper
'60': copper alloy
'61': coral
'62': cotton
'63': crystal
'64': diorite
'65': dolerite
'66': earthenware
'67': ebony
'68': emerald
'69': enamel
'70': faience
'71': felt
'72': flax
'73': flint
'74': gauze
'75': glass
'76': gold
'77': granite
'78': gray ware
'79': hardwood
'80': horn
'81': incense
'82': ink
'83': iron
'84': ivory
'85': jade
'86': jadeite
'87': jasper
'88': lacquer
'89': lapis lazuli
'90': lazurite
'91': lead
'92': lead alloy
'93': leather
'94': limestone
'95': linen
'96': malachite
'97': marble
'98': metal
'99': mineral
'100': mother of pearl
'101': muslin
'102': nephrite
'103': nylon
'104': obsidian
'105': organic material
'106': paint
'107': palm fiber
'108': palm leaf
'109': paper
'110': papier mâché
'111': papyrus
'112': pewter
'113': photographic paper
'114': pine
'115': plant fiber
'116': plaster
'117': plastic
'118': plate
'119': polyester
'120': polystyrene
'121': porcelain
'122': pottery
'123': quartzite
'124': rattan
'125': realgar
'126': reed
'127': rice paper
'128': rock
'129': rush
'130': sandstone
'131': satin
'132': schist
'133': seashell
'134': serpentine
'135': shell
'136': silk
'137': siltstone
'138': silver
'139': skull
'140': slate
'141': soapstone
'142': softwood
'143': stalagmites
'144': steel
'145': stone
'146': stoneware
'147': straw
'148': stucco
'149': sycamore
'150': synthetic fiber
'151': teak
'152': terracotta
'153': textiles
'154': tin
'155': tortoise shell
'156': tourmaline
'157': travertine
'158': tremolite
'159': turquoise
'160': velvet
'161': wood
'162': wool
'163': wrought iron
'164': zinc alloy
- name: production.period
dtype: string
- name: production.place
dtype: string
- name: new_root
dtype: string
splits:
- name: train
num_bytes: 741469047.96
num_examples: 23060
- name: validation
num_bytes: 168672680.74
num_examples: 5426
- name: test
num_bytes: 137567256.474
num_examples: 5426
download_size: 950282711
dataset_size: 1047708985.174
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
yagmurx/ataturk_voice_restorated | ---
license: unknown
---
|
bigbio/bio_sim_verb |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: Bio-SimVerb
homepage: https://github.com/cambridgeltl/bio-simverb
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- SEMANTIC_SIMILARITY
---
# Dataset Card for Bio-SimVerb
## Dataset Description
- **Homepage:** https://github.com/cambridgeltl/bio-simverb
- **Pubmed:** True
- **Public:** True
- **Tasks:** STS
This repository contains the evaluation datasets for the paper *Bio-SimVerb and Bio-SimLex: Wide-coverage Evaluation Sets of Word Similarity in Biomedicine* by Billy Chiu, Sampo Pyysalo, Ivan Vulić, and Anna Korhonen.
## Citation Information
```
@article{article,
title = {
Bio-SimVerb and Bio-SimLex: Wide-coverage evaluation sets of word
similarity in biomedicine
},
author = {Chiu, Billy and Pyysalo, Sampo and Vulić, Ivan and Korhonen, Anna},
year = 2018,
month = {02},
journal = {BMC Bioinformatics},
volume = 19,
pages = {},
doi = {10.1186/s12859-018-2039-z}
}
```
|
Artples/DS-2 | ---
license: apache-2.0
task_categories:
- text-to-image
tags:
- Unsplash
- LAI
pretty_name: LAI-Dataset 2
---
# Dataset Card for LAI-DS2
This dataset contains pictures from Unsplash paired with detailed descriptions of the images.
### Dataset Description
This dataset comprises a collection of Unsplash images accompanied by detailed descriptions.
- **Curated by:** Artur Lauche
- **License:** Apache-2.0
### Dataset Sources
All images are from Unsplash, and the descriptions were generated by a licensed AI model.
## Uses
Image-related tasks.
### Out-of-Scope Use
There is no NSFW content in the dataset.
### Curation Rationale
This dataset was curated for training my own text-to-image model.
#### Data Collection and Processing
All images have been filtered by a human.
#### Personal and Sensitive Information
The dataset contains images of people, including their faces, but the images are licensed and therefore usable. |
ruliad/factual-expert-processed-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3127554
num_examples: 100
download_size: 1812130
dataset_size: 3127554
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Miron/NLP_1 | ---
dataset_info:
features:
- name: Science artilce's texts
dtype: string
- name: text_length
dtype: int64
- name: TEXT
dtype: string
splits:
- name: train
num_bytes: 54709956.09102402
num_examples: 711
- name: validation
num_bytes: 6155831.908975979
num_examples: 80
download_size: 26356400
dataset_size: 60865788.0
---
# Dataset Card for "Text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/scitldr | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 4016919
num_examples: 3229
download_size: 2222180
dataset_size: 4016919
---
# Dataset Card for "scitldr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1 | ---
pretty_name: Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WhiteRabbitNeo/WhiteRabbitNeo-13B-v1](https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T16:51:00.125160](https://huggingface.co/datasets/open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1/blob/main/results_2024-01-19T16-51-00.125160.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4325743019051002,\n\
\ \"acc_stderr\": 0.03450564854492944,\n \"acc_norm\": 0.4356434201033021,\n\
\ \"acc_norm_stderr\": 0.03525272782306864,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44577231939553535,\n\
\ \"mc2_stderr\": 0.014884190006288057\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4462457337883959,\n \"acc_stderr\": 0.014526705548539982,\n\
\ \"acc_norm\": 0.4854948805460751,\n \"acc_norm_stderr\": 0.014605241081370056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5126468830910177,\n\
\ \"acc_stderr\": 0.0049881849883452855,\n \"acc_norm\": 0.6870145389364668,\n\
\ \"acc_norm_stderr\": 0.004627607991626908\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.03053333843046751,\n\
\ \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.03053333843046751\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.037038511930995194,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.037038511930995194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"\
acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3774193548387097,\n \"acc_stderr\": 0.027575960723278236,\n \"\
acc_norm\": 0.3774193548387097,\n \"acc_norm_stderr\": 0.027575960723278236\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"\
acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.035476014940069384,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.035476014940069384\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n\
\ \"acc_norm\": 0.5440414507772021,\n \"acc_norm_stderr\": 0.03594413711272437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635474,\n\
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635474\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.544954128440367,\n \"acc_stderr\": 0.02135050309092517,\n \"acc_norm\"\
: 0.544954128440367,\n \"acc_norm_stderr\": 0.02135050309092517\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n\
\ \"acc_stderr\": 0.032568505702936464,\n \"acc_norm\": 0.35185185185185186,\n\
\ \"acc_norm_stderr\": 0.032568505702936464\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.03418931233833343,\n\
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.03418931233833343\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n\
\ \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n\
\ \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5466155810983397,\n\
\ \"acc_stderr\": 0.0178020871358503,\n \"acc_norm\": 0.5466155810983397,\n\
\ \"acc_norm_stderr\": 0.0178020871358503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475353,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.028275490156791434,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.028275490156791434\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n\
\ \"acc_stderr\": 0.028333277109562783,\n \"acc_norm\": 0.4662379421221865,\n\
\ \"acc_norm_stderr\": 0.028333277109562783\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323674,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3285528031290743,\n\
\ \"acc_stderr\": 0.01199602724750291,\n \"acc_norm\": 0.3285528031290743,\n\
\ \"acc_norm_stderr\": 0.01199602724750291\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39215686274509803,\n \"acc_stderr\": 0.019751726508762626,\n \
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.019751726508762626\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004129,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48756218905472637,\n\
\ \"acc_stderr\": 0.0353443984853958,\n \"acc_norm\": 0.48756218905472637,\n\
\ \"acc_norm_stderr\": 0.0353443984853958\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.03829509868994727,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.03829509868994727\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44577231939553535,\n\
\ \"mc2_stderr\": 0.014884190006288057\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6740331491712708,\n \"acc_stderr\": 0.013173782636922187\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22365428354814254,\n \
\ \"acc_stderr\": 0.011477795578836105\n }\n}\n```"
repo_url: https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|arc:challenge|25_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|gsm8k|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hellaswag|10_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T16-51-00.125160.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- '**/details_harness|winogrande|5_2024-01-19T16-51-00.125160.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T16-51-00.125160.parquet'
- config_name: results
data_files:
- split: 2024_01_19T16_51_00.125160
path:
- results_2024-01-19T16-51-00.125160.parquet
- split: latest
path:
- results_2024-01-19T16-51-00.125160.parquet
---
# Dataset Card for Evaluation run of WhiteRabbitNeo/WhiteRabbitNeo-13B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [WhiteRabbitNeo/WhiteRabbitNeo-13B-v1](https://huggingface.co/WhiteRabbitNeo/WhiteRabbitNeo-13B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-19T16:51:00.125160](https://huggingface.co/datasets/open-llm-leaderboard/details_WhiteRabbitNeo__WhiteRabbitNeo-13B-v1/blob/main/results_2024-01-19T16-51-00.125160.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results files and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4325743019051002,
"acc_stderr": 0.03450564854492944,
"acc_norm": 0.4356434201033021,
"acc_norm_stderr": 0.03525272782306864,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44577231939553535,
"mc2_stderr": 0.014884190006288057
},
"harness|arc:challenge|25": {
"acc": 0.4462457337883959,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.4854948805460751,
"acc_norm_stderr": 0.014605241081370056
},
"harness|hellaswag|10": {
"acc": 0.5126468830910177,
"acc_stderr": 0.0049881849883452855,
"acc_norm": 0.6870145389364668,
"acc_norm_stderr": 0.004627607991626908
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.03053333843046751,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.03053333843046751
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.037038511930995194,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.037038511930995194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3774193548387097,
"acc_stderr": 0.027575960723278236,
"acc_norm": 0.3774193548387097,
"acc_norm_stderr": 0.027575960723278236
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.035476014940069384,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.035476014940069384
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.03594413711272437,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.03594413711272437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.544954128440367,
"acc_stderr": 0.02135050309092517,
"acc_norm": 0.544954128440367,
"acc_norm_stderr": 0.02135050309092517
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.03418931233833343,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.03418931233833343
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5466155810983397,
"acc_stderr": 0.0178020871358503,
"acc_norm": 0.5466155810983397,
"acc_norm_stderr": 0.0178020871358503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475353,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.028275490156791434,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.028275490156791434
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562783,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562783
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323674,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3285528031290743,
"acc_stderr": 0.01199602724750291,
"acc_norm": 0.3285528031290743,
"acc_norm_stderr": 0.01199602724750291
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.019751726508762626,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.019751726508762626
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004129,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48756218905472637,
"acc_stderr": 0.0353443984853958,
"acc_norm": 0.48756218905472637,
"acc_norm_stderr": 0.0353443984853958
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.03829509868994727,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.03829509868994727
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44577231939553535,
"mc2_stderr": 0.014884190006288057
},
"harness|winogrande|5": {
"acc": 0.6740331491712708,
"acc_stderr": 0.013173782636922187
},
"harness|gsm8k|5": {
"acc": 0.22365428354814254,
"acc_stderr": 0.011477795578836105
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
737-Lab/WebUiImprove | ---
license: openrail
---
|
567-labs/wikipedia-embedding-bge-base-en-v1.5-sample | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 191214046
num_examples: 28404
download_size: 190908742
dataset_size: 191214046
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nielsr/datacomp_small_llamav2_classified_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: detected_language
dtype: string
- name: label
dtype:
class_label:
names:
'0': bad
'1': good
splits:
- name: train
num_bytes: 16878987
num_examples: 50000
download_size: 12939770
dataset_size: 16878987
---
# Dataset Card for "datacomp_small_llamav2_classified_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_86 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 25978678848.5
num_examples: 270476
download_size: 23602836628
dataset_size: 25978678848.5
---
# Dataset Card for "chunk_86"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cubpaw/voxelgym_3c_200 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: rgb_label
dtype: image
splits:
- name: train
num_bytes: 342576.0
num_examples: 200
- name: validation
num_bytes: 86676.0
num_examples: 50
download_size: 277351
dataset_size: 429252.0
---
# Dataset Card for "voxelgym_3c_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-36000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1031049
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-one-sec-cv12/chunk_98 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1247081720
num_examples: 244910
download_size: 1271966974
dataset_size: 1247081720
---
# Dataset Card for "chunk_98"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EnD-Diffusers/AI_Faces | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- AI Faces
- Photography
- Stable Diffusion
pretty_name: AI Generated Faces
size_categories:
- 1K<n<10K
--- |
TimoImhof/TriviaQA-in-SQuAD-format | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: unmodified
num_bytes: 22886661
num_examples: 15368
- name: modified_30_percent
num_bytes: 22899894
num_examples: 15368
- name: modified_100_percent
num_bytes: 22929228
num_examples: 15368
download_size: 40760032
dataset_size: 68715783
---
# Dataset Card for "TriviaQA-in-SQuAD-format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HDBrinkmann/HDBTEST4PLAN01 | ---
license: apache-2.0
language:
- de
--- |
liuyanchen1015/MULTI_VALUE_rte_completive_have_done | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 487359
num_examples: 1156
- name: train
num_bytes: 426177
num_examples: 962
download_size: 583084
dataset_size: 913536
---
# Dataset Card for "MULTI_VALUE_rte_completive_have_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatrixStudio/Codeforces-Python-Submissions-SFT | ---
dataset_info:
features:
- name: contestId
dtype: int64
- name: index
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: rating
dtype: int64
- name: tags
sequence: string
- name: title
dtype: string
- name: time-limit
dtype: string
- name: memory-limit
dtype: string
- name: problem-description
dtype: string
- name: input-specification
dtype: string
- name: output-specification
dtype: string
- name: demo-input
sequence: string
- name: demo-output
sequence: string
- name: note
dtype: string
- name: points
dtype: float64
- name: test_cases
list:
- name: input
dtype: string
- name: output
dtype: string
- name: creationTimeSeconds
dtype: int64
- name: relativeTimeSeconds
dtype: int64
- name: programmingLanguage
dtype: string
- name: verdict
dtype: string
- name: testset
dtype: string
- name: passedTestCount
dtype: int64
- name: timeConsumedMillis
dtype: int64
- name: memoryConsumedBytes
dtype: int64
- name: code
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 300024850.1601567
num_examples: 44097
- name: test
num_bytes: 32217909.425747305
num_examples: 4735
download_size: 27700824
dataset_size: 332242759.585904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_6.7b_mode_VQAv2_visclues_detection_ns_10_open_ended | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_bs_8
num_bytes: 1619
num_examples: 10
download_size: 2691
dataset_size: 1619
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_6.7b_mode_VQAv2_visclues_detection_ns_10_open_ended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/pmc_v2 | ---
pretty_name: '`pmc/v2`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `pmc/v2`
The `pmc/v2` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/pmc#pmc/v2).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=1,255,260
This dataset is used by: [`pmc_v2_trec-cds-2016`](https://huggingface.co/datasets/irds/pmc_v2_trec-cds-2016)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/pmc_v2', 'docs')
for record in docs:
record # {'doc_id': ..., 'journal': ..., 'title': ..., 'abstract': ..., 'body': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
pasindu/COCO_20p | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 18109769383.492
num_examples: 113209
download_size: 3858741055
dataset_size: 18109769383.492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
polinaeterna/test_push_dataset_dict_update_dataset_info | ---
dataset_info:
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1120
num_examples: 70
- name: random
num_bytes: 800
num_examples: 50
download_size: 3746
dataset_size: 1920
- config_name: v1
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 1280
num_examples: 80
- name: random
num_bytes: 960
num_examples: 60
download_size: 3932
dataset_size: 2240
configs_kwargs:
- config_name: default
data_dir: ./
- config_name: v1
data_dir: v1
---
# Dataset Card for "test_push_dataset_dict_update_dataset_info"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cryscan/multilingual-share | ---
license: cc0-1.0
language:
- en
- zh
---
# Multilingual Share GPT
Multilingual ShareGPT, a free multi-language corpus for LLM training. All text is converted to Markdown format and classified by language.
## Github Repo
Follow the [link](https://github.com/RWKV-Wiki/MultilingualShareGPT) here to Github.
## Data Example
```json
{
"id": "ImiMfCY",
"lang": "en",
"text": "\n<|user|>: Let's play chess\n\n<|bot|>: Sure, I'd love to play chess with you! Do you want to play a virtual game or just ask chess-related questions?\n\n<|user|>: Virtual game\n\n<|bot|>: I'm sorry, I'm an AI language model and I don't have the ability to display a chess board or move pieces on it. However, I'd be happy to help you with any chess-related questions or problems you might have. Just let me know what you need help with!\n\n<|user|>: Let's play\n\n<|bot|>: I apologize for any confusion, but as an AI language model I do not have the ability to play chess as it requires a graphical interface. However, I can help you learn and improve your chess skills by answering questions and providing explanations and analysis of specific positions or games.\n\n<|user|>: 1.e4\n\n<|bot|>: The move 1.e4 is the most common first move in the game of chess, and is known as the \"King's Pawn Opening\". This move aims to control the center of the board and create space for the other pieces to develop. Black has a variety of responses to 1.e4, but some of the most common are 1...e5 (the \"King's Gambit\"), 1...c5 (the \"Sicilian Defense\"), and 1...e6 (the \"French Defense\").\n\n<|user|>: I play 1.e4\n\n<|bot|>: Very good! You have started the game with the King's Pawn Opening. As black, I will respond with 1...e5, the King's Gambit.\n\n<|user|>: I play 2.Nf3\n\n<|bot|>: That's a good move. With 2.Nf3, you are developing your knight and attacking the black pawn on e5. As black, I'll play 2...Nc6, which develops the knight to a strong central square and prepares to castle kingside.\n\n<|user|>: I play 3.d4\n\n<|bot|>: With the move 3.d4, you are now offering a pawn to black with the King's Gambit. This gambit can lead to a sharp and tactical game, but it also weakens the white pawn structure and can lead to a disadvantage if not handled properly. 
As black, I will accept the gambit with 3...exd4.\n\n<|user|>: I play 4.Nxd4\n\n<|bot|>: With 4.Nxd4, you are now attacking the black pawn on d4 and forcing it to capture. As black, I'll capture the pawn with 4...Nxd4. This has now opened up the d-file for my rook and given me a lead in development.\n\n<|user|>: I play 5.Qxd4\n\n<|bot|>: With the move 5.Qxd4, you have taken control of the d-file and are attacking the black knight on d4. As black, I will respond with 5...Nc6, developing my knight to a safer square and putting pressure on the white queen.\n\n<|user|>: I play 6.Qd3\n\n<|bot|>: With 6.Qd3, you are attacking the black knight again and preparing to castle kingside. This move also has the idea of putting pressure on the d7-pawn, as well as the d8-h4 diagonal. As black, I will respond with 6...Nf6, developing my knight to a strong central square and preparing to castle kingside.\n\n<|user|>: I play 7.Nc3\n\n<|bot|>: With 7.Nc3, you are developing your knight and preparing to castle kingside. This move also has the idea of supporting the d4-queen and putting pressure on the e5-pawn. As black, I will play 7...d6, solidifying my pawn structure and preparing to develop my dark-squared bishop.\n\n<|user|>: I play 8.Bg5\n\n<|bot|>: With 8.Bg5, you are attacking the black knight on f6 and preparing to castle kingside. This move also has the idea of putting pressure on the h7-pawn and controlling the f4-square. As black, I will respond with 8...Be7, developing my dark-squared bishop and preparing to castle kingside.\n\n<|user|>: I play 9.0-0-0\n\n<|bot|>: With 9.0-0-0, you are now castling kingside and putting your king in a safer position. This move also has the idea of bringing the rook into the game more quickly. As black, I will castle kingside as well with 9...0-0-0.\n\n<|user|>: I play 10.Kb1\n\n<|bot|>: I'm sorry, but that is an illegal move. In chess, the king can only move one square at a time and cannot \"jump\" over other pieces. 
The correct move would be 10.Be2, developing your bishop and preparing to castle kingside.\n\n<|endoftext|>"
}
```
## For Chinese Users
To advance the development of Chinese AI and promote the openness and internationalization of AI technology, we founded the ShareGPT-90k project, hoping to pool the community's efforts on data cleaning and alignment.
It may differ from what you imagine: GPT models are primarily empowered by their pretraining datasets, and corpus quality is critical to a model's final performance. However, platforms such as Baidu Zhidao, CSDN, and Zhihu carry too much promotional content; high-quality platforms such as Xiaomuchong offer too little material; and the quality of personal blog content varies widely.
OpenAI's data collection was so costly that it had to raise funds from Microsoft. We cannot bear such an enormous expense, so we are asking volunteers with some foreign-language ability who are committed to building an open-access corpus to contribute their efforts.
If you are interested in participating in this project, we would be deeply grateful.
Project page: https://paratranz.cn/projects/6725 |
KaiserML/SemanticScholar_all-distilroberta-v1_Embeddings | ---
dataset_info:
features:
- name: corpusid
dtype: int64
- name: openaccessinfo
struct:
- name: externalids
struct:
- name: ACL
dtype: string
- name: ArXiv
dtype: string
- name: DOI
dtype: string
- name: MAG
dtype: string
- name: PubMedCentral
dtype: string
- name: license
dtype: string
- name: status
dtype: string
- name: url
dtype: string
- name: abstract
dtype: string
- name: updated
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 709059031
num_examples: 440000
download_size: 552056679
dataset_size: 709059031
---
# Dataset Card for "SemanticScholar_all-distilroberta-v1_Embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
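The card above lists an `embedding` feature (a `float32` sequence per row) but gives no usage example. A minimal sketch of cosine-similarity search over such embeddings; the `top_k_similar` helper is hypothetical and not part of the dataset, and loading via `datasets.load_dataset` is assumed from the repo id in the card:

```python
import numpy as np

def top_k_similar(query: np.ndarray, corpus: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k corpus rows most cosine-similar to `query`."""
    # Normalize rows and the query so the dot product equals cosine similarity.
    corpus_norm = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    sims = corpus_norm @ query_norm      # cosine similarities, shape (n,)
    return np.argsort(-sims)[:k]         # indices of the k highest scores

# With the real data, `corpus` would come from the `embedding` column, e.g.:
#   ds = load_dataset("KaiserML/SemanticScholar_all-distilroberta-v1_Embeddings", split="train")
#   corpus = np.array(ds["embedding"], dtype=np.float32)
```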
Tony-Yuan/chemistry-assistant | ---
license: apache-2.0
---
|
FINNUMBER/FINCH_TRAIN_All_900_per100 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3789521
num_examples: 900
download_size: 2074549
dataset_size: 3789521
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/purple_heart_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of purple_heart/パープルハート/绀紫之心 (Azur Lane)
This is the dataset of purple_heart/パープルハート/绀紫之心 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `purple_hair, long_hair, breasts, very_long_hair, blue_eyes, braid, symbol-shaped_pupils, twin_braids, hair_ornament, hair_between_eyes, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 826.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/purple_heart_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 435.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/purple_heart_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1300 | 965.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/purple_heart_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 717.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/purple_heart_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1300 | 1.40 GiB | [Download](https://huggingface.co/datasets/CyberHarem/purple_heart_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/purple_heart_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, leotard, solo, looking_at_viewer, gloves, cleavage_cutout, open_mouth, thighhighs |
| 1 | 19 |  |  |  |  |  | 1girl, leotard, looking_at_viewer, power_symbol, solo, covered_navel, cleavage_cutout, bangs, smile, blush, gloves, cowboy_shot, simple_background |
| 2 | 14 |  |  |  |  |  | 1girl, bangs, cleavage_cutout, gloves, leotard, looking_at_viewer, power_symbol, solo, neon_trim, black_thighhighs, floating_hair, emblem, standing, turtleneck, blush, cowboy_shot, holding_sword, magical_girl, serious, legs_apart, covered_navel, glowing, wings, headgear |
| 3 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, power_symbol, solo, bodysuit, skin_tight, blush, covered_navel, sword |
| 4 | 6 |  |  |  |  |  | 1girl, black_bikini, blush, looking_at_viewer, power_symbol, solo, cleavage, wet, water, navel, smile, twintails |
| 5 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, solo, black_bikini, cleavage, smile, side-tie_bikini_bottom, hair_flower, water |
| 6 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, power_symbol, solo, black_dress, blush, elbow_gloves, hair_flower, smile, black_gloves, upper_body, earrings |
| 7 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, power_symbol, solo, completely_nude, smile, pussy |
| 8 | 5 |  |  |  |  |  | 2girls, bangs, cleavage_cutout, covered_navel, gloves, leotard, looking_at_viewer, magical_girl, power_symbol, solo_focus, cowboy_shot, sidelocks, cloud, day, from_side, outstretched_arm, sky, thighhighs, turtleneck, d-pad_hair_ornament, hood, light_smile |
| 9 | 13 |  |  |  |  |  | 1girl, feathered_wings, halo, solo, angel_wings, bare_shoulders, hair_flower, power_symbol, cleavage, looking_at_viewer, blush, thighhighs, smile, elbow_gloves, leotard |
| 10 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, elbow_gloves, looking_at_viewer, navel, power_symbol, solo, white_gloves, blush, cowboy_shot, revealing_clothes, white_thighhighs, smile, twintails, wings |
| 11 | 12 |  |  |  |  |  | bare_shoulders, blush, looking_at_viewer, playboy_bunny, power_symbol, rabbit_ears, 1girl, detached_collar, pantyhose, wrist_cuffs, cleavage, solo, black_leotard, covered_navel, smile, fake_animal_ears, fishnets, twintails, rabbit_tail, bow |
| 12 | 18 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, blush, nipples, smile, mosaic_censoring, looking_at_viewer, open_mouth, paizuri, d-pad_hair_ornament, nude, sweat, cum, power_symbol |
| 13 | 5 |  |  |  |  |  | 1girl, alternate_breast_size, blush, completely_nude, huge_breasts, looking_at_viewer, looking_back, nipples, outdoors, shiny_skin, solo, huge_ass, bangs, curvy, day, from_behind, open_mouth, thick_thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | leotard | solo | looking_at_viewer | gloves | cleavage_cutout | open_mouth | thighhighs | power_symbol | covered_navel | bangs | smile | cowboy_shot | simple_background | neon_trim | black_thighhighs | floating_hair | emblem | standing | turtleneck | holding_sword | magical_girl | serious | legs_apart | glowing | wings | headgear | bodysuit | skin_tight | sword | black_bikini | cleavage | wet | water | navel | twintails | side-tie_bikini_bottom | hair_flower | bare_shoulders | black_dress | elbow_gloves | black_gloves | upper_body | earrings | nipples | completely_nude | pussy | 2girls | solo_focus | sidelocks | cloud | day | from_side | outstretched_arm | sky | d-pad_hair_ornament | hood | light_smile | feathered_wings | halo | angel_wings | white_gloves | revealing_clothes | white_thighhighs | playboy_bunny | rabbit_ears | detached_collar | pantyhose | wrist_cuffs | black_leotard | fake_animal_ears | fishnets | rabbit_tail | bow | 1boy | hetero | penis | mosaic_censoring | paizuri | nude | sweat | cum | alternate_breast_size | huge_breasts | looking_back | outdoors | shiny_skin | huge_ass | curvy | from_behind | thick_thighs |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:----------|:-------|:--------------------|:---------|:------------------|:-------------|:-------------|:---------------|:----------------|:--------|:--------|:--------------|:--------------------|:------------|:-------------------|:----------------|:---------|:-----------|:-------------|:----------------|:---------------|:----------|:-------------|:----------|:--------|:-----------|:-----------|:-------------|:--------|:---------------|:-----------|:------|:--------|:--------|:------------|:-------------------------|:--------------|:-----------------|:--------------|:---------------|:---------------|:-------------|:-----------|:----------|:------------------|:--------|:---------|:-------------|:------------|:--------|:------|:------------|:-------------------|:------|:----------------------|:-------|:--------------|:------------------|:-------|:--------------|:---------------|:--------------------|:-------------------|:----------------|:--------------|:------------------|:------------|:--------------|:----------------|:-------------------|:-----------|:--------------|:------|:-------|:---------|:--------|:-------------------|:----------|:-------|:--------|:------|:------------------------|:---------------|:---------------|:-----------|:-------------|:-----------|:--------|:--------------|:---------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | | | X | | X | X | X | | X | X | X | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 13 |  |  |  |  |  | X | X | X | X | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | | X | X | | | | | X | | | X | X | | | | | | | | | | | | | X | | | | | | X | | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 12 |  |  |  |  |  | X | X | | X | X | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 12 | 18 |  |  |  |  |  | X | X | | | X | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | |
| 13 | 5 |  |  |  |  |  | X | X | | X | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_rte_clause_final_really_but | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: string
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 702
num_examples: 1
download_size: 0
dataset_size: 702
---
# Dataset Card for "MULTI_VALUE_rte_clause_final_really_but"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imhabii/test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 51289671.0
num_examples: 1
download_size: 45155689
dataset_size: 51289671.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yonischeyer/promptPlusInput | ---
license: unknown
---
|
joey234/mmlu-college_computer_science-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 12788.7
num_examples: 30
download_size: 12204
dataset_size: 12788.7
---
# Dataset Card for "mmlu-college_computer_science-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
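The `answer` feature above is a `class_label` whose integer values map to the letter names listed in the metadata. A minimal sketch of recovering the choice letter from a row; the helper name is illustrative, not part of the dataset:

```python
# Letter names for the `answer` class_label, in the order listed in the card metadata.
ANSWER_NAMES = ["A", "B", "C", "D"]

def label_to_letter(label: int) -> str:
    """Map an integer `answer` label back to its choice letter."""
    return ANSWER_NAMES[label]
```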
tasksource/context_toxicity | ---
license: apache-2.0
---
https://github.com/ipavlopoulos/context_toxicity/
```
@inproceedings{xenos-etal-2021-context,
title = "Context Sensitivity Estimation in Toxicity Detection",
author = "Xenos, Alexandros and
Pavlopoulos, John and
Androutsopoulos, Ion",
booktitle = "Proceedings of the 5th Workshop on Online Abuse and Harms (WOAH 2021)",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.woah-1.15",
doi = "10.18653/v1/2021.woah-1.15",
pages = "140--145",
abstract = "User posts whose perceived toxicity depends on the conversational context are rare in current toxicity detection datasets. Hence, toxicity detectors trained on current datasets will also disregard context, making the detection of context-sensitive toxicity a lot harder when it occurs. We constructed and publicly release a dataset of 10k posts with two kinds of toxicity labels per post, obtained from annotators who considered (i) both the current post and the previous one as context, or (ii) only the current post. We introduce a new task, context-sensitivity estimation, which aims to identify posts whose perceived toxicity changes if the context (previous post) is also considered. Using the new dataset, we show that systems can be developed for this task. Such systems could be used to enhance toxicity detection datasets with more context-dependent posts or to suggest when moderators should consider the parent posts, which may not always be necessary and may introduce additional costs.",
}
``` |
carles-undergrad-thesis/mmarco-hardnegs-bm25 | ---
dataset_info:
features:
- name: qid
dtype: string
- name: pos
sequence: string
- name: neg
sequence: string
splits:
- name: train
num_bytes: 73918683
num_examples: 532751
download_size: 52012395
dataset_size: 73918683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-squad-plain_text-fee91a-2282172274 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad
eval_info:
task: extractive_question_answering
model: Aiyshwariya/bert-finetuned-squad
metrics: ['squad', 'squad_v2']
dataset_name: squad
dataset_config: plain_text
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Aiyshwariya/bert-finetuned-squad
* Dataset: squad
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@bestuh](https://huggingface.co/bestuh) for evaluating this model. |
HamdanXI/arb-eng-parallel-10k-splitted-translated-arabic | ---
dataset_info:
features:
- name: arabic
dtype: string
- name: english
dtype: string
- name: translated
dtype: string
splits:
- name: train
num_bytes: 4714807
num_examples: 7999
- name: validation
num_bytes: 571638
num_examples: 1000
- name: test
num_bytes: 585646
num_examples: 1000
download_size: 3399538
dataset_size: 5872091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Des1gn-1/at1 | ---
license: openrail
---
|
denizzhansahin/Turkish_News_News-2-2024 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: Baslik
dtype: string
- name: Ozet
dtype: string
- name: Kategori
dtype: string
- name: Link
dtype: string
- name: Icerik
dtype: string
splits:
- name: train
num_bytes: 9631059.992609017
num_examples: 4735
- name: validation
num_bytes: 4129050.007390983
num_examples: 2030
download_size: 7797770
dataset_size: 13760110.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Goader/ukrainian-treebank-lm | ---
license: cc-by-nc-sa-4.0
task_categories:
- fill-mask
- text-generation
language:
- uk
pretty_name: Ukrainian Treebank (Language Modeling)
--- |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-99000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 668887
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Muthuchancoach/Technology_GarageQA | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39367
num_examples: 177
download_size: 7655
dataset_size: 39367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Technology_GarageQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chathuranga-jayanath/context-5-predict-token-for-fine-tune-without-comments-from-times4j | ---
dataset_info:
features:
- name: id
dtype: int64
- name: filepath
dtype: string
- name: start_bug_line
dtype: int64
- name: end_bug_line
dtype: int64
- name: bug
dtype: string
- name: fix
dtype: string
- name: ctx
dtype: string
splits:
- name: train
num_bytes: 80978808
num_examples: 134751
- name: validation
num_bytes: 9645229
num_examples: 16843
- name: test
num_bytes: 9816295
num_examples: 16843
download_size: 11275982
dataset_size: 100440332
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
SkyWR/SkyWR2 | ---
license: openrail
---
|
open-llm-leaderboard/details_LLM360__AmberChat | ---
pretty_name: Evaluation run of LLM360/AmberChat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LLM360/AmberChat](https://huggingface.co/LLM360/AmberChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LLM360__AmberChat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T00:43:50.737456](https://huggingface.co/datasets/open-llm-leaderboard/details_LLM360__AmberChat/blob/main/results_2024-03-01T00-43-50.737456.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38959952903381356,\n\
\ \"acc_stderr\": 0.0341361554393818,\n \"acc_norm\": 0.3942078227351397,\n\
\ \"acc_norm_stderr\": 0.03495443087227231,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087281,\n \"mc2\": 0.41182368750935106,\n\
\ \"mc2_stderr\": 0.01458048423160228\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.40102389078498296,\n \"acc_stderr\": 0.01432225579071987,\n\
\ \"acc_norm\": 0.42918088737201365,\n \"acc_norm_stderr\": 0.014464085894870655\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5540728938458475,\n\
\ \"acc_stderr\": 0.004960516570284905,\n \"acc_norm\": 0.7400916152160925,\n\
\ \"acc_norm_stderr\": 0.0043768776192341175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779205,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.36416184971098264,\n\
\ \"acc_stderr\": 0.036690724774169084,\n \"acc_norm\": 0.36416184971098264,\n\
\ \"acc_norm_stderr\": 0.036690724774169084\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.0314895582974553,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.0314895582974553\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147127,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147127\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4064516129032258,\n\
\ \"acc_stderr\": 0.027941727346256315,\n \"acc_norm\": 0.4064516129032258,\n\
\ \"acc_norm_stderr\": 0.027941727346256315\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588718,\n\
\ \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588718\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4393939393939394,\n \"acc_stderr\": 0.0353608594752948,\n \"acc_norm\"\
: 0.4393939393939394,\n \"acc_norm_stderr\": 0.0353608594752948\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442207,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442207\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335075,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335075\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886838,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886838\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.03479185572599661,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.03479185572599661\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5009174311926605,\n \"acc_stderr\": 0.021437287056051208,\n \"\
acc_norm\": 0.5009174311926605,\n \"acc_norm_stderr\": 0.021437287056051208\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39705882352941174,\n \"acc_stderr\": 0.03434131164719129,\n \"\
acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.03434131164719129\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.42616033755274263,\n \"acc_stderr\": 0.03219035703131774,\n \
\ \"acc_norm\": 0.42616033755274263,\n \"acc_norm_stderr\": 0.03219035703131774\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.42748091603053434,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.42748091603053434,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.47107438016528924,\n \"acc_stderr\": 0.04556710331269498,\n \"\
acc_norm\": 0.47107438016528924,\n \"acc_norm_stderr\": 0.04556710331269498\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.42718446601941745,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.42718446601941745,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5598290598290598,\n\
\ \"acc_stderr\": 0.032520741720630506,\n \"acc_norm\": 0.5598290598290598,\n\
\ \"acc_norm_stderr\": 0.032520741720630506\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5172413793103449,\n\
\ \"acc_stderr\": 0.017869330154003705,\n \"acc_norm\": 0.5172413793103449,\n\
\ \"acc_norm_stderr\": 0.017869330154003705\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3786127167630058,\n \"acc_stderr\": 0.02611374936131034,\n\
\ \"acc_norm\": 0.3786127167630058,\n \"acc_norm_stderr\": 0.02611374936131034\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.477124183006536,\n \"acc_stderr\": 0.028599936776089786,\n\
\ \"acc_norm\": 0.477124183006536,\n \"acc_norm_stderr\": 0.028599936776089786\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3954983922829582,\n\
\ \"acc_stderr\": 0.027770918531427834,\n \"acc_norm\": 0.3954983922829582,\n\
\ \"acc_norm_stderr\": 0.027770918531427834\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.45987654320987653,\n \"acc_stderr\": 0.027731022753539277,\n\
\ \"acc_norm\": 0.45987654320987653,\n \"acc_norm_stderr\": 0.027731022753539277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29791395045632335,\n\
\ \"acc_stderr\": 0.011680717340400042,\n \"acc_norm\": 0.29791395045632335,\n\
\ \"acc_norm_stderr\": 0.011680717340400042\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.36437908496732024,\n \"acc_stderr\": 0.019469518221573702,\n \
\ \"acc_norm\": 0.36437908496732024,\n \"acc_norm_stderr\": 0.019469518221573702\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n\
\ \"acc_stderr\": 0.0472457740573157,\n \"acc_norm\": 0.41818181818181815,\n\
\ \"acc_norm_stderr\": 0.0472457740573157\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03168091161233882,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03168091161233882\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.52046783625731,\n \"acc_stderr\": 0.0383161053282193,\n\
\ \"acc_norm\": 0.52046783625731,\n \"acc_norm_stderr\": 0.0383161053282193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.015528566637087281,\n \"mc2\": 0.41182368750935106,\n\
\ \"mc2_stderr\": 0.01458048423160228\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6661404893449092,\n \"acc_stderr\": 0.013254029695143348\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \
\ \"acc_stderr\": 0.0062982217961795785\n }\n}\n```"
repo_url: https://huggingface.co/LLM360/AmberChat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-43-50.737456.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-43-50.737456.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- '**/details_harness|winogrande|5_2024-03-01T00-43-50.737456.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T00-43-50.737456.parquet'
- config_name: results
data_files:
- split: 2024_03_01T00_43_50.737456
path:
- results_2024-03-01T00-43-50.737456.parquet
- split: latest
path:
- results_2024-03-01T00-43-50.737456.parquet
---
# Dataset Card for Evaluation run of LLM360/AmberChat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LLM360/AmberChat](https://huggingface.co/LLM360/AmberChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LLM360__AmberChat",
"harness_winogrande_5",
split="train")
```
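The timestamped splits and parquet paths listed in the YAML above follow a simple naming convention; here is a minimal sketch of it (inferred from this card's metadata, not an official API):

```python
# Sketch of the naming convention inferred from the YAML above:
# split names replace '-' and ':' in the run timestamp with '_',
# while parquet file suffixes replace only ':' with '-'.
run_timestamp = "2024-03-01T00:43:50.737456"

split_name = run_timestamp.replace("-", "_").replace(":", "_")
file_suffix = run_timestamp.replace(":", "-")

print(split_name)   # 2024_03_01T00_43_50.737456
print(file_suffix)  # 2024-03-01T00-43-50.737456
```

This is how the `split=` argument for a specific run can be derived from its timestamp instead of using the `latest` alias.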
## Latest results
These are the [latest results from run 2024-03-01T00:43:50.737456](https://huggingface.co/datasets/open-llm-leaderboard/details_LLM360__AmberChat/blob/main/results_2024-03-01T00-43-50.737456.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38959952903381356,
"acc_stderr": 0.0341361554393818,
"acc_norm": 0.3942078227351397,
"acc_norm_stderr": 0.03495443087227231,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087281,
"mc2": 0.41182368750935106,
"mc2_stderr": 0.01458048423160228
},
"harness|arc:challenge|25": {
"acc": 0.40102389078498296,
"acc_stderr": 0.01432225579071987,
"acc_norm": 0.42918088737201365,
"acc_norm_stderr": 0.014464085894870655
},
"harness|hellaswag|10": {
"acc": 0.5540728938458475,
"acc_stderr": 0.004960516570284905,
"acc_norm": 0.7400916152160925,
"acc_norm_stderr": 0.0043768776192341175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779205,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.036690724774169084,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.036690724774169084
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362466,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362466
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147127,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147127
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4064516129032258,
"acc_stderr": 0.027941727346256315,
"acc_norm": 0.4064516129032258,
"acc_norm_stderr": 0.027941727346256315
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588718,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588718
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4393939393939394,
"acc_stderr": 0.0353608594752948,
"acc_norm": 0.4393939393939394,
"acc_norm_stderr": 0.0353608594752948
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442207,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442207
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335075,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335075
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886838,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886838
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.03479185572599661,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.03479185572599661
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5009174311926605,
"acc_stderr": 0.021437287056051208,
"acc_norm": 0.5009174311926605,
"acc_norm_stderr": 0.021437287056051208
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.42616033755274263,
"acc_stderr": 0.03219035703131774,
"acc_norm": 0.42616033755274263,
"acc_norm_stderr": 0.03219035703131774
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.42748091603053434,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.42748091603053434,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.47107438016528924,
"acc_stderr": 0.04556710331269498,
"acc_norm": 0.47107438016528924,
"acc_norm_stderr": 0.04556710331269498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.42718446601941745,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.42718446601941745,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5598290598290598,
"acc_stderr": 0.032520741720630506,
"acc_norm": 0.5598290598290598,
"acc_norm_stderr": 0.032520741720630506
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.017869330154003705,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.017869330154003705
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3786127167630058,
"acc_stderr": 0.02611374936131034,
"acc_norm": 0.3786127167630058,
"acc_norm_stderr": 0.02611374936131034
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.477124183006536,
"acc_stderr": 0.028599936776089786,
"acc_norm": 0.477124183006536,
"acc_norm_stderr": 0.028599936776089786
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3954983922829582,
"acc_stderr": 0.027770918531427834,
"acc_norm": 0.3954983922829582,
"acc_norm_stderr": 0.027770918531427834
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.45987654320987653,
"acc_stderr": 0.027731022753539277,
"acc_norm": 0.45987654320987653,
"acc_norm_stderr": 0.027731022753539277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29791395045632335,
"acc_stderr": 0.011680717340400042,
"acc_norm": 0.29791395045632335,
"acc_norm_stderr": 0.011680717340400042
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.36437908496732024,
"acc_stderr": 0.019469518221573702,
"acc_norm": 0.36437908496732024,
"acc_norm_stderr": 0.019469518221573702
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.0472457740573157,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.0472457740573157
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03168091161233882,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03168091161233882
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.52046783625731,
"acc_stderr": 0.0383161053282193,
"acc_norm": 0.52046783625731,
"acc_norm_stderr": 0.0383161053282193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.015528566637087281,
"mc2": 0.41182368750935106,
"mc2_stderr": 0.01458048423160228
},
"harness|winogrande|5": {
"acc": 0.6661404893449092,
"acc_stderr": 0.013254029695143348
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.0062982217961795785
}
}
```
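The per-task scores in the JSON above can be reduced to a simple unweighted average; below is a sketch using three of the values shown (the leaderboard's own MMLU aggregate averages all 57 hendrycksTest subtasks, so this is illustrative only):

```python
# Unweighted mean accuracy over a few of the hendrycksTest entries above;
# the real leaderboard average covers all 57 subtasks.
scores = {
    "hendrycksTest-abstract_algebra": 0.31,
    "hendrycksTest-anatomy": 0.4148148148148148,
    "hendrycksTest-astronomy": 0.4144736842105263,
}
average_acc = sum(scores.values()) / len(scores)
print(round(average_acc, 4))  # 0.3798
```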
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PK03/Orca-filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 2074578643.6622322
num_examples: 1216347
download_size: 1515594488
dataset_size: 2074578643.6622322
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_abhinand__tamil-llama-7b-base-v0.1 | ---
pretty_name: Evaluation run of abhinand/tamil-llama-7b-base-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhinand/tamil-llama-7b-base-v0.1](https://huggingface.co/abhinand/tamil-llama-7b-base-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__tamil-llama-7b-base-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T21:11:05.655102](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__tamil-llama-7b-base-v0.1/blob/main/results_2023-12-13T21-11-05.655102.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.41015709547468715,\n\
\ \"acc_stderr\": 0.034474662187784014,\n \"acc_norm\": 0.41587829741735977,\n\
\ \"acc_norm_stderr\": 0.03541855748648199,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023498,\n \"mc2\": 0.3592775546075334,\n\
\ \"mc2_stderr\": 0.013858573967213928\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4402730375426621,\n \"acc_stderr\": 0.014506769524804236,\n\
\ \"acc_norm\": 0.4667235494880546,\n \"acc_norm_stderr\": 0.014578995859605806\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5305715992830113,\n\
\ \"acc_stderr\": 0.004980445551991267,\n \"acc_norm\": 0.7285401314479187,\n\
\ \"acc_norm_stderr\": 0.0044380385833450945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115205,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115205\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.03962135573486219,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.03962135573486219\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.4046242774566474,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031722,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031722\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.49032258064516127,\n\
\ \"acc_stderr\": 0.028438677998909558,\n \"acc_norm\": 0.49032258064516127,\n\
\ \"acc_norm_stderr\": 0.028438677998909558\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n\
\ \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4494949494949495,\n \"acc_stderr\": 0.0354413249194797,\n \"acc_norm\"\
: 0.4494949494949495,\n \"acc_norm_stderr\": 0.0354413249194797\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.5544041450777202,\n \"acc_stderr\": 0.03587014986075659,\n\
\ \"acc_norm\": 0.5544041450777202,\n \"acc_norm_stderr\": 0.03587014986075659\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.025203571773028333,\n\
\ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.025203571773028333\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5266055045871559,\n \"acc_stderr\": 0.021406952688151577,\n \"\
acc_norm\": 0.5266055045871559,\n \"acc_norm_stderr\": 0.021406952688151577\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.0346022832723917,\n \"acc_norm\"\
: 0.4166666666666667,\n \"acc_norm_stderr\": 0.0346022832723917\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.459915611814346,\n \"acc_stderr\": 0.03244246810187913,\n \"acc_norm\"\
: 0.459915611814346,\n \"acc_norm_stderr\": 0.03244246810187913\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4484304932735426,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.4484304932735426,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.04384140024078016,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.04384140024078016\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5041322314049587,\n \"acc_stderr\": 0.04564198767432754,\n \"\
acc_norm\": 0.5041322314049587,\n \"acc_norm_stderr\": 0.04564198767432754\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3374233128834356,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.3374233128834356,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n\
\ \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5341880341880342,\n\
\ \"acc_stderr\": 0.03267942734081228,\n \"acc_norm\": 0.5341880341880342,\n\
\ \"acc_norm_stderr\": 0.03267942734081228\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5108556832694764,\n\
\ \"acc_stderr\": 0.017875748840242407,\n \"acc_norm\": 0.5108556832694764,\n\
\ \"acc_norm_stderr\": 0.017875748840242407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\
\ \"acc_stderr\": 0.0154457169109989,\n \"acc_norm\": 0.30837988826815643,\n\
\ \"acc_norm_stderr\": 0.0154457169109989\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664267,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664267\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4437299035369775,\n\
\ \"acc_stderr\": 0.02821768355665232,\n \"acc_norm\": 0.4437299035369775,\n\
\ \"acc_norm_stderr\": 0.02821768355665232\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4228395061728395,\n \"acc_stderr\": 0.027487472980871598,\n\
\ \"acc_norm\": 0.4228395061728395,\n \"acc_norm_stderr\": 0.027487472980871598\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32978723404255317,\n \"acc_stderr\": 0.028045946942042398,\n \
\ \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.028045946942042398\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30247718383311606,\n\
\ \"acc_stderr\": 0.011731524234165703,\n \"acc_norm\": 0.30247718383311606,\n\
\ \"acc_norm_stderr\": 0.011731524234165703\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03025437257397669,\n\
\ \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03025437257397669\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.32679738562091504,\n \"acc_stderr\": 0.018975427920507215,\n \
\ \"acc_norm\": 0.32679738562091504,\n \"acc_norm_stderr\": 0.018975427920507215\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4857142857142857,\n \"acc_stderr\": 0.03199615232806287,\n\
\ \"acc_norm\": 0.4857142857142857,\n \"acc_norm_stderr\": 0.03199615232806287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.03711725190740749,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.03711725190740749\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066165,\n\
\ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066165\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023498,\n \"mc2\": 0.3592775546075334,\n\
\ \"mc2_stderr\": 0.013858573967213928\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7071823204419889,\n \"acc_stderr\": 0.012789321118542613\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/abhinand/tamil-llama-7b-base-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|arc:challenge|25_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|gsm8k|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hellaswag|10_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T21-11-05.655102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T21-11-05.655102.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- '**/details_harness|winogrande|5_2023-12-13T21-11-05.655102.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T21-11-05.655102.parquet'
- config_name: results
data_files:
- split: 2023_12_13T21_11_05.655102
path:
- results_2023-12-13T21-11-05.655102.parquet
- split: latest
path:
- results_2023-12-13T21-11-05.655102.parquet
---
# Dataset Card for Evaluation run of abhinand/tamil-llama-7b-base-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/tamil-llama-7b-base-v0.1](https://huggingface.co/abhinand/tamil-llama-7b-base-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__tamil-llama-7b-base-v0.1",
"harness_winogrande_5",
	split="latest")
```
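The timestamped split names used in the configurations above appear to be derived from the run timestamp by replacing its `-` and `:` characters with `_`. A small helper (illustrative only, not part of any library) makes the mapping explicit:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Map a run timestamp to its split name, e.g.
    "2023-12-13T21:11:05.655102" -> "2023_12_13T21_11_05.655102"."""
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2023-12-13T21:11:05.655102"))
# -> 2023_12_13T21_11_05.655102
```

This matches the split names listed in the `configs` section of the YAML front matter.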
## Latest results
These are the [latest results from run 2023-12-13T21:11:05.655102](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__tamil-llama-7b-base-v0.1/blob/main/results_2023-12-13T21-11-05.655102.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.41015709547468715,
"acc_stderr": 0.034474662187784014,
"acc_norm": 0.41587829741735977,
"acc_norm_stderr": 0.03541855748648199,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023498,
"mc2": 0.3592775546075334,
"mc2_stderr": 0.013858573967213928
},
"harness|arc:challenge|25": {
"acc": 0.4402730375426621,
"acc_stderr": 0.014506769524804236,
"acc_norm": 0.4667235494880546,
"acc_norm_stderr": 0.014578995859605806
},
"harness|hellaswag|10": {
"acc": 0.5305715992830113,
"acc_stderr": 0.004980445551991267,
"acc_norm": 0.7285401314479187,
"acc_norm_stderr": 0.0044380385833450945
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.03962135573486219,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.03962135573486219
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031722,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031722
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4494949494949495,
"acc_stderr": 0.0354413249194797,
"acc_norm": 0.4494949494949495,
"acc_norm_stderr": 0.0354413249194797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5544041450777202,
"acc_stderr": 0.03587014986075659,
"acc_norm": 0.5544041450777202,
"acc_norm_stderr": 0.03587014986075659
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.025203571773028333,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.025203571773028333
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5266055045871559,
"acc_stderr": 0.021406952688151577,
"acc_norm": 0.5266055045871559,
"acc_norm_stderr": 0.021406952688151577
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.0346022832723917,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.0346022832723917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.459915611814346,
"acc_stderr": 0.03244246810187913,
"acc_norm": 0.459915611814346,
"acc_norm_stderr": 0.03244246810187913
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4484304932735426,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.4484304932735426,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5041322314049587,
"acc_stderr": 0.04564198767432754,
"acc_norm": 0.5041322314049587,
"acc_norm_stderr": 0.04564198767432754
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3374233128834356,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.3374233128834356,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5341880341880342,
"acc_stderr": 0.03267942734081228,
"acc_norm": 0.5341880341880342,
"acc_norm_stderr": 0.03267942734081228
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5108556832694764,
"acc_stderr": 0.017875748840242407,
"acc_norm": 0.5108556832694764,
"acc_norm_stderr": 0.017875748840242407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.0154457169109989,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.0154457169109989
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664267,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664267
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4437299035369775,
"acc_stderr": 0.02821768355665232,
"acc_norm": 0.4437299035369775,
"acc_norm_stderr": 0.02821768355665232
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4228395061728395,
"acc_stderr": 0.027487472980871598,
"acc_norm": 0.4228395061728395,
"acc_norm_stderr": 0.027487472980871598
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.028045946942042398,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.028045946942042398
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30247718383311606,
"acc_stderr": 0.011731524234165703,
"acc_norm": 0.30247718383311606,
"acc_norm_stderr": 0.011731524234165703
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03025437257397669,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03025437257397669
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32679738562091504,
"acc_stderr": 0.018975427920507215,
"acc_norm": 0.32679738562091504,
"acc_norm_stderr": 0.018975427920507215
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4857142857142857,
"acc_stderr": 0.03199615232806287,
"acc_norm": 0.4857142857142857,
"acc_norm_stderr": 0.03199615232806287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.035319879302087305,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.035319879302087305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.03711725190740749,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.03711725190740749
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066165,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066165
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023498,
"mc2": 0.3592775546075334,
"mc2_stderr": 0.013858573967213928
},
"harness|winogrande|5": {
"acc": 0.7071823204419889,
"acc_stderr": 0.012789321118542613
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
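Once loaded, each per-task entry is a flat dict of metrics. Reading a single value back out of the structure above is straightforward (the dict literal below is a hand-copied excerpt of the results, shown only for illustration):

```python
# Hand-copied excerpt of the results above, for illustration only.
results = {
    "harness|winogrande|5": {
        "acc": 0.7071823204419889,
        "acc_stderr": 0.012789321118542613,
    },
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
}

# Pull out a single metric and report it with its standard error.
acc = results["harness|winogrande|5"]["acc"]
err = results["harness|winogrande|5"]["acc_stderr"]
print(f"winogrande acc: {acc:.3f} +/- {err:.3f}")
# -> winogrande acc: 0.707 +/- 0.013
```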
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MartinKu/bookcorpus_stage2_coverage_100000 | ---
dataset_info:
features:
- name: text
dtype: string
- name: S_V_position
sequence: int64
- name: O_C_position
sequence: int64
- name: start_point_list
sequence: int64
splits:
- name: train
num_bytes: 60883646
num_examples: 99955
download_size: 7551557
dataset_size: 60883646
---
# Dataset Card for "bookcorpus_stage2_coverage_100000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
uatafaque/galamba | ---
license: openrail
---
|
open-llm-leaderboard/details_SanjiWatsuki__neural-chat-7b-v3-3-wizardmath-dare-me | ---
pretty_name: Evaluation run of SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me](https://huggingface.co/SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__neural-chat-7b-v3-3-wizardmath-dare-me\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-27T13:07:52.569856](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__neural-chat-7b-v3-3-wizardmath-dare-me/blob/main/results_2023-12-27T13-07-52.569856.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5841627607476706,\n\
\ \"acc_stderr\": 0.03324720041632836,\n \"acc_norm\": 0.5857534686922531,\n\
\ \"acc_norm_stderr\": 0.03390900076414602,\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967646,\n \"mc2\": 0.6260346176960404,\n\
\ \"mc2_stderr\": 0.01578574508599339\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064664,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268447\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6515634335789683,\n\
\ \"acc_stderr\": 0.004755013243022125,\n \"acc_norm\": 0.8263294164509062,\n\
\ \"acc_norm_stderr\": 0.0037805175193024827\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.02437319786798306,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.02437319786798306\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036545,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713548,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713548\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.01709057380421791,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.01709057380421791\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069432,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069432\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.02536116874968822,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.02536116874968822\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2212290502793296,\n\
\ \"acc_stderr\": 0.013882164598887277,\n \"acc_norm\": 0.2212290502793296,\n\
\ \"acc_norm_stderr\": 0.013882164598887277\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.02742047766262924,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.02742047766262924\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893937,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893937\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n\
\ \"acc_stderr\": 0.012530241301193184,\n \"acc_norm\": 0.40352020860495436,\n\
\ \"acc_norm_stderr\": 0.012530241301193184\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073066,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073066\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967646,\n \"mc2\": 0.6260346176960404,\n\
\ \"mc2_stderr\": 0.01578574508599339\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7166535122336227,\n \"acc_stderr\": 0.012664751735505323\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5701288855193328,\n \
\ \"acc_stderr\": 0.013636344017393732\n }\n}\n```"
repo_url: https://huggingface.co/SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|arc:challenge|25_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|gsm8k|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hellaswag|10_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T13-07-52.569856.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-27T13-07-52.569856.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- '**/details_harness|winogrande|5_2023-12-27T13-07-52.569856.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-27T13-07-52.569856.parquet'
- config_name: results
data_files:
- split: 2023_12_27T13_07_52.569856
path:
- results_2023-12-27T13-07-52.569856.parquet
- split: latest
path:
- results_2023-12-27T13-07-52.569856.parquet
---
# Dataset Card for Evaluation run of SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me](https://huggingface.co/SanjiWatsuki/neural-chat-7b-v3-3-wizardmath-dare-me) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__neural-chat-7b-v3-3-wizardmath-dare-me",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-27T13:07:52.569856](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__neural-chat-7b-v3-3-wizardmath-dare-me/blob/main/results_2023-12-27T13-07-52.569856.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5841627607476706,
"acc_stderr": 0.03324720041632836,
"acc_norm": 0.5857534686922531,
"acc_norm_stderr": 0.03390900076414602,
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967646,
"mc2": 0.6260346176960404,
"mc2_stderr": 0.01578574508599339
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064664,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.6515634335789683,
"acc_stderr": 0.004755013243022125,
"acc_norm": 0.8263294164509062,
"acc_norm_stderr": 0.0037805175193024827
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.04164188720169377,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.04164188720169377
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.02437319786798306,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.02437319786798306
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036545,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713548,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713548
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.01709057380421791,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.01709057380421791
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069432,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.02536116874968822,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.02536116874968822
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2212290502793296,
"acc_stderr": 0.013882164598887277,
"acc_norm": 0.2212290502793296,
"acc_norm_stderr": 0.013882164598887277
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.02742047766262924,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.02742047766262924
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893937,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893937
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193184,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193184
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073066,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073066
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967646,
"mc2": 0.6260346176960404,
"mc2_stderr": 0.01578574508599339
},
"harness|winogrande|5": {
"acc": 0.7166535122336227,
"acc_stderr": 0.012664751735505323
},
"harness|gsm8k|5": {
"acc": 0.5701288855193328,
"acc_stderr": 0.013636344017393732
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PaulLoisel/mlp_splitted | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: purchased_products
dtype: float64
- name: review_time_spent
dtype: float64
- name: label
dtype: int64
- name: text
dtype: string
- name: Beauty Products
dtype: float64
- name: Furniture
dtype: float64
- name: Toys
dtype: float64
splits:
- name: train
num_bytes: 636.6
num_examples: 3
- name: test
num_bytes: 212.2
num_examples: 1
- name: val
num_bytes: 212.2
num_examples: 1
download_size: 15368
dataset_size: 1061.0
---
# Dataset Card for "mlp_splitted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_doubly_filled_comp | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 783
num_examples: 4
- name: train
num_bytes: 4525
num_examples: 32
download_size: 6646
dataset_size: 5308
---
# Dataset Card for "MULTI_VALUE_sst2_doubly_filled_comp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Xiangyun2018/GalaxySpectra0-10000 | ---
license: apache-2.0
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-100000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 653372
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gluten/gpk-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 45126936.0
num_examples: 83
download_size: 45128569
dataset_size: 45126936.0
---
# Dataset Card for "gpk-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Yeva/armSum | ---
license: other
---
|
arazd/llama_features_cot | ---
license: openrail
---
Llama-2 representations extracted from CoT dataset samples (the original sample order is preserved). Representations are taken from the final layer and averaged across all tokens.
Dataset structure: key = sample ID, value = feature vector serialized as a string with ";" as the separator. |
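Since each value is a serialized string, a minimal sketch of turning it back into a numeric vector might look like the following (the key name and values are hypothetical, purely for illustration):

```python
# Hypothetical example of reading this structure: the key is a sample ID and the
# value is a feature vector serialized as a ";"-separated string.
features = {"sample_0": "0.12;-0.48;1.07;0.33"}

def parse_vector(serialized: str) -> list[float]:
    # Split on the ";" separator and convert each entry back to a float.
    return [float(x) for x in serialized.split(";")]

vector = parse_vector(features["sample_0"])
print(vector)  # [0.12, -0.48, 1.07, 0.33]
```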
DeAllGamer/VARAG | ---
license: mit
task_categories:
- text-generation
- text-classification
- image-to-text
language:
- en
pretty_name: Varag_Dataset
--- |
SINAI/eSOLdomainGlobal | ---
license: cc-by-nc-sa-4.0
---
# DESCRIPTION
One of the main challenges in Opinion Mining is the creation of resources adapted to a specific domain. eSOLdomainGlobal is a set of Spanish opinion-indicator word lists covering 8 different domains: cars, hotels, washing machines, books, mobile phones, music, computers, and movies. The lists were generated from the iSOL lexicon; following a corpus-based method over the Spanish version of the SFU Review Corpus, the 8 lists were produced. |
Elriggs/openwebtext-100k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 497257202
num_examples: 100000
download_size: 302558045
dataset_size: 497257202
---
# Dataset Card for "openwebtext-100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/o-mini | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 64170652
num_examples: 56037
download_size: 31497758
dataset_size: 64170652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "o-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chansung/lm_response_test3 | ---
dataset_info:
features:
- name: instructions
dtype: string
- name: target_responses
dtype: string
- name: candidate_responses
dtype: string
splits:
- name: test_split
num_bytes: 26668
num_examples: 16
download_size: 27526
dataset_size: 26668
configs:
- config_name: default
data_files:
- split: test_split
path: data/test_split-*
---
# Dataset Card for "lm_response_test3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wttdotm/AYTA_Datasets | ---
tags:
- Reddit
- OpenAI
- GPT-3
- Davinci-002
- PRAW
- PMAW
size_categories:
- 10K<n<100K
---
# Are You The Asshole Training Data
These are the datasets used for a project Alex Petros and I made called [AreYouTheAsshole.com](https://www.areyoutheasshole.com). The site is intended to give users a fun and interactive way to experience the effect of bias in AI due to skewed data. We achieved this by fine-tuning three GPT-3 Davinci-002 models on the prompt/completion pairs you see here.
Each prompt/completion pair constitutes a post body (the prompt) and a comment (the completion). Just as there may be multiple comments to a single post, there may be multiple completions for a single prompt.
The dataset was filtered down from >100,000 post/comment pairs to only those whose comments started with a clear acronym judgement. So, comments like "Well I think YTA because..." were filtered out, whereas comments like "YTA and it's not even close..." were kept.
After filtering for clear judgement, we had our neutral dataset, the one you can find in "Neutral_Dataset.jsonl". In order to create intentionally biased data, we then split that dataset into two subsets based on whether a given post/comment pair's comment judged the poster as The Asshole or Not The Asshole. Some edge cases were also filtered out.
The dataset contains three sets:
- Neutral_Dataset.jsonl (contains all clear judgements, YTA, NTA, etc.)
- YTA_Dataset.jsonl (only contains judgements of YTA or similar)
- NTA_Dataset.jsonl (only contains judgements of NTA or similar)
### Data Collection:
This data was collected from Reddit's r/AmITheAsshole subreddit using PMAW/PRAW and the Reddit API. |
edbeeching/godot_rl_AirHockey | ---
library_name: godot-rl
tags:
- deep-reinforcement-learning
- reinforcement-learning
- godot-rl
- environments
- video-games
---
An RL environment called AirHockey for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_AirHockey
```
|
kye/all-edwardzhang-python-code | ---
license: mit
---
|
betogaunt/minhasvozes | ---
license: openrail
---
|
Sampson2022/demo2 | ---
license: apache-2.0
---
|
hippocrates/CitationGPTZero_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 660104
num_examples: 589
- name: valid
num_bytes: 660104
num_examples: 589
- name: test
num_bytes: 660104
num_examples: 589
download_size: 860133
dataset_size: 1980312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
ToniAqqia/chico_synthetic | ---
license: mit
---
|
corralm/awesome-prompt-patterns | ---
license: cc
task_categories:
- text-generation
language:
- en
tags:
- code
size_categories:
- n<1K
---
# 💬 Awesome Prompt Patterns
Prompt patterns are reusable instructions that guide AI responses for specific tasks; they are defined by core contextual statements that improve the precision and relevance of an LLM's output.
View more prompt patterns and techniques on [GitHub](https://github.com/corralm/awesome-prompting).
|
CAiRE/prosocial-dialog-kor_Hang | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: rots
sequence: string
- name: safety_label
dtype: string
- name: safety_annotations
sequence: string
- name: safety_annotation_reasons
sequence: string
- name: source
dtype: string
- name: etc
dtype: string
- name: dialogue_id
dtype: int64
- name: response_id
dtype: int64
- name: episode_done
dtype: bool
- name: mt_context
dtype: string
splits:
- name: train
num_bytes: 78576351
num_examples: 120236
- name: validation
num_bytes: 13338951
num_examples: 20416
- name: test
num_bytes: 16306444
num_examples: 25029
download_size: 50246041
dataset_size: 108221746
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_lloorree__kssht-castor-70b | ---
pretty_name: Evaluation run of lloorree/kssht-castor-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/kssht-castor-70b](https://huggingface.co/lloorree/kssht-castor-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-castor-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-18T23:54:47.734205](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-castor-70b/blob/main/results_2023-09-18T23-54-47.734205.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7025630433354887,\n\
\ \"acc_stderr\": 0.03070323641112233,\n \"acc_norm\": 0.7065431366848456,\n\
\ \"acc_norm_stderr\": 0.03067233267965294,\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5630669446354012,\n\
\ \"mc2_stderr\": 0.014865953800030475\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.01393680921215829,\n\
\ \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.01344952210993249\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6857199761003784,\n\
\ \"acc_stderr\": 0.004632797375289762,\n \"acc_norm\": 0.8753236407090221,\n\
\ \"acc_norm_stderr\": 0.003296764320821918\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n\
\ \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\"\
: 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172523,\n \"\
acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172523\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"\
acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.022939925418530616,\n\
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863804,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863804\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n\
\ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n\
\ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888156,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5541069100391134,\n\
\ \"acc_stderr\": 0.012695244711379783,\n \"acc_norm\": 0.5541069100391134,\n\
\ \"acc_norm_stderr\": 0.012695244711379783\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7679738562091504,\n \"acc_stderr\": 0.01707737337785693,\n \
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.01707737337785693\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n\
\ \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n\
\ \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5630669446354012,\n\
\ \"mc2_stderr\": 0.014865953800030475\n }\n}\n```"
repo_url: https://huggingface.co/lloorree/kssht-castor-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-54-47.734205.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-18T23-54-47.734205.parquet'
- config_name: results
data_files:
- split: 2023_09_18T23_54_47.734205
path:
- results_2023-09-18T23-54-47.734205.parquet
- split: latest
path:
- results_2023-09-18T23-54-47.734205.parquet
---
# Dataset Card for Evaluation run of lloorree/kssht-castor-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-castor-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-castor-70b](https://huggingface.co/lloorree/kssht-castor-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-castor-70b",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-18T23:54:47.734205](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-castor-70b/blob/main/results_2023-09-18T23-54-47.734205.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7025630433354887,
"acc_stderr": 0.03070323641112233,
"acc_norm": 0.7065431366848456,
"acc_norm_stderr": 0.03067233267965294,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5630669446354012,
"mc2_stderr": 0.014865953800030475
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.01393680921215829,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.01344952210993249
},
"harness|hellaswag|10": {
"acc": 0.6857199761003784,
"acc_stderr": 0.004632797375289762,
"acc_norm": 0.8753236407090221,
"acc_norm_stderr": 0.003296764320821918
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172523,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172523
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880236,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880236
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5463687150837989,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.5463687150837989,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5541069100391134,
"acc_stderr": 0.012695244711379783,
"acc_norm": 0.5541069100391134,
"acc_norm_stderr": 0.012695244711379783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.01707737337785693,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.01707737337785693
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5630669446354012,
"mc2_stderr": 0.014865953800030475
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hojzas/proj8-chatML | ---
license: apache-2.0
---
|
RussianNLP/wikiomnia | ---
license: apache-2.0
dataset_info:
- config_name: wikiomnia_ruT5_raw
features:
- name: title
dtype: string
- name: categories
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: batch_id
dtype: string
splits:
- name: dev
num_bytes: 600356136
num_examples: 266295
- name: test
num_bytes: 572651444
num_examples: 267751
download_size: 1204094848
dataset_size: 1173007580
- config_name: wikiomnia_ruT5_filtered
features:
- name: title
dtype: string
- name: categories
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: batch_id
dtype: string
splits:
- name: train
num_bytes: 4157093224
num_examples: 2088027
download_size: 4278635364
dataset_size: 4157093224
- config_name: wikiomnia_ruGPT3_filtered
features:
- name: title
dtype: string
- name: categories
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: batch_id
dtype: string
splits:
- name: train
num_bytes: 338607635
num_examples: 173314
download_size: 348694031
dataset_size: 338607635
- config_name: wikiomnia_ruGPT3_raw
features:
- name: title
dtype: string
- name: categories
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: batch_id
dtype: string
splits:
- name: train_batch1
num_bytes: 553204785
num_examples: 260808
- name: train_batch2
num_bytes: 542823205
num_examples: 263599
- name: train_batch3
num_bytes: 582321994
num_examples: 269736
- name: train_batch4
num_bytes: 543315355
num_examples: 265948
- name: train_batch5
num_bytes: 513288049
num_examples: 268466
- name: train_batch6
num_bytes: 943556173
num_examples: 512147
- name: train_batch7
num_bytes: 929464509
num_examples: 508149
- name: train_batch8
num_bytes: 915128725
num_examples: 507559
- name: train_batch9
num_bytes: 926443048
num_examples: 504292
- name: train_batch10
num_bytes: 834958539
num_examples: 463812
- name: train_batch11
num_bytes: 509866027
num_examples: 287770
- name: train_batch12
num_bytes: 478843738
num_examples: 271410
- name: train_batch13
num_bytes: 757068702
num_examples: 385730
- name: train_batch14
num_bytes: 575937629
num_examples: 304110
- name: train_batch15
num_bytes: 517092031
num_examples: 277507
- name: train_batch16
num_bytes: 759363156
num_examples: 402203
- name: train_batch17
num_bytes: 860544388
num_examples: 466572
- name: train_batch18
num_bytes: 935985528
num_examples: 518348
- name: train_batch19
num_bytes: 936782197
num_examples: 514307
- name: train_batch20
num_bytes: 874299949
num_examples: 487238
download_size: 14939875008
dataset_size: 14490287727
- config_name: wikiomnia_ruT5_raw_train
features:
- name: title
dtype: string
- name: categories
dtype: string
- name: summary
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: batch_id
dtype: string
splits:
- name: train_batch3
num_bytes: 612693602
num_examples: 271391
- name: train_batch4
num_bytes: 570286147
num_examples: 265947
- name: train_batch5
num_bytes: 552502041
num_examples: 274650
- name: train_batch6
num_bytes: 1017066184
num_examples: 525224
- name: train_batch7
num_bytes: 972351430
num_examples: 509615
- name: train_batch8
num_bytes: 973314180
num_examples: 516828
- name: train_batch9
num_bytes: 981651841
num_examples: 512709
- name: train_batch10
num_bytes: 880664685
num_examples: 469512
- name: train_batch11
num_bytes: 543971388
num_examples: 294631
- name: train_batch12
num_bytes: 503939060
num_examples: 273526
- name: train_batch13
num_bytes: 794421530
num_examples: 392021
- name: train_batch14
num_bytes: 610815879
num_examples: 311452
- name: train_batch15
num_bytes: 540225492
num_examples: 278677
- name: train_batch16
num_bytes: 804003566
num_examples: 411192
- name: train_batch17
num_bytes: 903347135
num_examples: 469871
- name: train_batch18
num_bytes: 995239085
num_examples: 528301
- name: train_batch19
num_bytes: 1003402360
num_examples: 522264
- name: train_batch20
num_bytes: 948137237
num_examples: 499866
download_size: 14634332336
dataset_size: 14208032842
task_categories:
- question-answering
language:
- ru
tags:
- wikipedia
- wikiomnia
- squad
- QA
pretty_name: WikiOmnia
size_categories:
- 1M<n<10M
---
# Dataset Card for "Wikiomnia"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** [https://github.com/RussianNLP](https://github.com/RussianNLP)
- **Paper:** [WikiOmnia: filtration and evaluation of the generated QA corpus on the whole Russian Wikipedia](https://arxiv.org/abs/2204.08009)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
We present the WikiOmnia dataset, a new publicly available set of QA-pairs and corresponding Russian Wikipedia article summary sections, composed with a fully automated generative pipeline. The dataset includes every available article from Wikipedia for the Russian language. The WikiOmnia pipeline is available open-source and is also tested for creating SQuAD-formatted QA on other domains, like news texts, fiction, and social media. The resulting dataset includes two parts: raw data on the whole Russian Wikipedia (7,930,873 QA pairs with paragraphs for ruGPT-3 XL and 7,991,040 QA pairs with paragraphs for ruT5-large) and cleaned data with strict automatic verification (over 160,000 QA pairs with paragraphs for ruGPT-3 XL and over 3,400,000 QA pairs with paragraphs for ruT5-large).
WikiOmnia consists of 2 parts:
1. the voluminous, automatically generated part: 15.9 million triplets consisting of the original article summary, a corresponding generated question and a generated answer;
2. the filtered part: a subsample of 3.5 million triplets, fully verified by automatic means
WikiOmnia adheres to the standard SQuAD format, resulting in triplets "text paragraph - question based on paragraph - answer from the paragraph"; see the following example:
**Original Wikipedia paragraph**: Коити Масимо (яп. Масимо Ко:ити) — известный режиссёр аниме и основатель японской анимационной студии Bee Train. С
момента основания студии он руководит производством почти всех её картин, а также время от времени принимает участие в работе над анимацией и музыкой.
**English translation**: Koichi Mashimo is a famous anime director and the founder of the Japanese animation studio Bee Train. Since the creation of the studio, he directed almost all studio’s works, and he
also sometimes participates in art and sound tasks.
**Generated question (ruT5)**: Кто является основателем японской анимационной студии Bee Train?
**Generated answer (ruT5)**: Коити Масимо
**English QA translation**: Who is the founder of the Japanese animation studio Bee Train? Koichi Mashimo
## Dataset Creation
Models used for dataset generation:
- [ruT5](https://huggingface.co/sberbank-ai/ruT5-large) large fine-tuned on SberQuaD
- [ruGPT-3](https://huggingface.co/sberbank-ai/rugpt3xl) XL fine-tuned on SberQuaD
- [ruBERT](http://docs.deeppavlov.ai/en/master/features/models/squad.html) DeepPavlov tuned for QA tasks
Source: Wikipedia version March 2021
Special tokens: <[TEXT]>, <[QUESTION]>, <[ANSWER]>

## Additional Information
### Licensing Information
[Apache 2.0 license](https://github.com/RussianNLP/WikiOmnia/blob/main/LICENSE)
### Citation Information
```
@inproceedings{pisarevskaya-shavrina-2022-wikiomnia,
title = "{W}iki{O}mnia: filtration and evaluation of the generated {QA} corpus on the whole {R}ussian {W}ikipedia",
author = "Pisarevskaya, Dina and
Shavrina, Tatiana",
booktitle = "Proceedings of the 2nd Workshop on Natural Language Generation, Evaluation, and Metrics (GEM)",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates (Hybrid)",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.gem-1.10",
pages = "125--135",
abstract = "The General QA field has been developing the methodology referencing the Stanford Question answering dataset (SQuAD) as the significant benchmark. Compiling factual questions datasets requires manual annotations, limiting the training data{'}s potential size. We present the WikiOmnia dataset, a new publicly available set of QA pairs and corresponding Russian Wikipedia article summary sections, composed with a fully automated generation and filtration pipeline. To ensure high quality of generated QA pairs, diverse manual and automated evaluation techniques were applied. The WikiOmnia pipeline is available open-source and is also tested for creating SQuAD-formatted QA on other domains, like news texts, fiction, and social media. The resulting dataset includes two parts: raw data on the whole Russian Wikipedia (7,930,873 QA pairs with paragraphs for ruGPT-3 XL and 7,991,040 QA pairs with paragraphs for ruT5-large) and cleaned data with strict automatic verification (over 160,000 QA pairs with paragraphs for ruGPT-3 XL and over 3,400,000 QA pairs with paragraphs for ruT5-large).",
}
```
### Contributions
Thanks to [@Deenochka](https://github.com/deenochka), [@TatianaShavrina](https://github.com/TatianaShavrina) |
KTH/waxholm | ---
language:
- sv
task_categories:
- automatic-speech-recognition
---
# THE WAXHOLM CORPUS
The Waxholm corpus was collected in 1993 - 1994 at the department of
Speech, Hearing and Music (TMH), KTH. It is described in several
publications. Two are included in this archive. Publication of work
using the Waxholm corpus should refer to either of these. More
information on the Waxholm project can be found on the web page
http://www.speech.kth.se/waxholm/waxholm2.html
## FILE INFORMATION
### SAMPLED FILES
The .smp files contain the speech signal. The identity
of the speaker is coded by the two digits after 'fp20' in the file
name. The smp file format was developed by TMH. Recording information
is stored in a header as a 1024 byte text string. The speech signal in
the Waxholm corpus is quantised into 16 bits, 2 bytes/sample and the
byte order is big-endian (most significant byte first). The sampling
frequency is 16 kHz. Here is an example of a file header:
```
>head -9 fp2001.1.01.smp
file=samp ; file type is sampled signal
msb=first ; byte order
sftot=16000 ; sampling frequency in Hz
nchans=1 ; number of channels
preemph=no ; no signal preemphasis during recording
view=-10,10
born=/o/libhex/ad_da.h25
range=-12303,11168 ; amplitude range
=
```
### LABEL FILES
Normally, each sample file has a label file. This has been
produced in four steps. The first step was to manually enter the
orthographic text by listening. From this text a sequence of phonemes
was produced by a rule-based text-to-phoneme module. The endpoint
time positions of the phonemes were computed by an automatic alignment
program, followed by manual correction. Some of the speech files have
no label file, due to different problems in this process. These files
should not be used for training or testing.
The labels are stored in .mix files. Below is an example of the
beginning of a mix file.
```
>head -20 fp2001.1.01.smp.mix
CORRECTED: OK jesper Jesper Hogberg Thu Jun 22 13:26:26 EET 1995
AUTOLABEL: tony A. de Serpa-Leitao Mon Nov 15 13:44:30 MET 1993
Waxholm dialog. /u/wax/data/scenes/fp2001/fp2001.1.01.smp
TEXT:
jag vill }ka h{rifr}n .
J'A:+ V'IL+ "]:K'A H'[3RIFR]N.
CT 1
Labels: J'A: V'IL "]:KkA H'[3RIFR]N .
FR 11219 #J >pm #J >w jag 0.701 sec
FR 12565 $'A: >pm $'A:+ 0.785 sec
FR 13189 #V >pm #V >w vill 0.824 sec
FR 13895 $'I >pm $'I 0.868 sec
FR 14700 $L >pm $L+ 0.919 sec
```
The orthographic text representation follows the label 'TEXT:'. CT is
the frame length in number of sample points (always 1 in Waxholm
mix files). Each line starting with 'FR' contains up to three labels at
the phonetic, phonemic and word levels. FR is immediately followed by
the frame number of the start of the segment. Since CT = 1, FR is the
sample index in the file. If a frame duration is 0, the label has
been judged as a non-pronounced segment and deleted by the manual
labeller, although it was generated by the text-to-phoneme or the
automatic alignment modules. Column 3 in an FR line is the phonetic
label. Initial '#' indicates word initial position. '$' indicates
other positions. The optional label '>pm' precedes the phonemic label,
which has been generated by the text-to-phoneme rules. Often, the
phonemic and the phonetic labels are identical. The optional '>w' is
followed by the identity of the word beginning at this frame. The
phoneme symbol inventory is mainly STA, used by the KTH/TMH RULSYS
system. It is specified in the included file 'sampa_latex_se.pdf'.
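As a minimal sketch, an FR line can be parsed with a few lines of Python. The 16 kHz sample rate used here is an assumption, not stated in this archive description, but it is consistent with the example above, where sample 11219 is listed at 0.701 sec:

```python
def parse_fr_line(line, sample_rate=16000):
    """Parse one 'FR' label line from a Waxholm .mix file.

    Returns the start sample, the phonetic label (column 3), and the
    optional phonemic ('>pm') and word ('>w') labels. The 16 kHz
    sample rate is an assumption (sample 11219 / 16000 = 0.701 s,
    matching the trailing time stamp in the example above).
    """
    tokens = line.split()
    if tokens[0] != "FR":
        raise ValueError("not an FR line: " + line)
    entry = {
        "start_sample": int(tokens[1]),   # since CT = 1, a sample index
        "phonetic": tokens[2],            # '#' word-initial, '$' otherwise
        "phonemic": None,
        "word": None,
    }
    entry["start_time"] = entry["start_sample"] / sample_rate
    i = 3
    while i < len(tokens):
        if tokens[i] == ">pm" and i + 1 < len(tokens):
            entry["phonemic"] = tokens[i + 1]
            i += 2
        elif tokens[i] == ">w" and i + 1 < len(tokens):
            entry["word"] = tokens[i + 1]
            i += 2
        else:
            i += 1  # skip the trailing '0.701 sec' time stamp
    return entry

info = parse_fr_line("FR 11219 #J >pm #J >w jag 0.701 sec")
```

For the first FR line of the example, this yields the phonetic label `#J`, the phonemic label `#J`, and the word `jag` starting at sample 11219.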
Some extra labels at the phonetic level have been defined.
The most common ones are:
| Label | Description |
|---------------------|------------------------------------------|
|sm | lip or tongue opening |
|p: | silent interval |
|pa | aspirative sound from breathing |
|kl | click sound |
|v | short vocalic segment between consonants |
|upper case of stops | occlusion |
|lower case of stops | burst |
The line beginning 'Labels:' before the FR lines is a text string
assembled from the FR labels.
The mix files in this archive correspond to those with the name
extension .mix.new in the original corpus. Besides a few other
corrections, the main difference is that burst segments after
retroflex stops were not labelled as retroflex in the original .mix
files (d and t after 2D and 2T have been changed to 2d and 2t).
## REFERENCES
Bertenstam, J., Blomberg, M., Carlson, R., Elenius, K., Granström,
B., Gustafson, J., Hunnicutt, S., Högberg, J., Lindell, R., Neovius,
L., Nord, L., de Serpa-Leitao, A., & Ström, N. (1995). "Spoken
dialogue data collected in the WAXHOLM project", STL-QPSR 1/1995,
KTH/TMH, Stockholm.

Bertenstam, J., Blomberg, M., Carlson, R., Elenius, K., Granström,
B., Gustafson, J., Hunnicutt, S., Högberg, J., Lindell, R., Neovius,
L., de Serpa-Leitao, A., Nord, L., & Ström, N. (1995). "The Waxholm
application data-base". In Pardo, J.M. (Ed.), Proceedings Eurospeech
1995 (pp. 833-836). Madrid.
Comments and error reports are welcome. These should be sent to:
Mats Blomberg <matsb@speech.kth.se> or Kjell Elenius <kjell@speech.kth.se> |
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_6.7b_mode_VQAv2_visclues_detection_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 26699615
num_examples: 1000
download_size: 5515967
dataset_size: 26699615
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_6.7b_mode_VQAv2_visclues_detection_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Erynan/4_ethics_5 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12149379
num_examples: 13629
download_size: 2262542
dataset_size: 12149379
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BrianWan221/trial | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': HMI
'1': euv
splits:
- name: train
num_bytes: 1341052823.0
num_examples: 81
download_size: 1317503216
dataset_size: 1341052823.0
---
# Dataset Card for "trial"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |