| datasetId | card |
|---|---|
CaterinaLac/sharegpt-deduplicated | ---
license: apache-2.0
task_categories:
- conversational
language:
- en
- zh
- ko
- fr
- ja
- es
- 'no'
- et
- de
- ca
- vi
- fi
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
### Dataset Summary
This dataset is a deduplicated version of [sharegpt4](https://huggingface.co/datasets/shibing624/sharegpt_gpt4).
<br>The deduplication process has two steps:<br>
1. Literal duplicates (identical input and output) are removed.
2. The remaining 5749 instances are embedded with the [SentenceTransformer library](https://www.sbert.net/) ("paraphrase-multilingual-mpnet-base-v2" model).
Then, the cosine similarity is computed for all possible pairs, and pairs with a similarity > 0.95 are considered paraphrases. For each paraphrase group, only one element is retained.
The resulting dataset has 5139 elements.
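The similarity-based step can be sketched as follows. This is a minimal illustration on toy vectors: in the actual pipeline the embeddings come from the SentenceTransformer "paraphrase-multilingual-mpnet-base-v2" model, and the greedy grouping below is only one plausible way of keeping a single element per paraphrase group.

```python
import numpy as np

def deduplicate(embeddings, threshold=0.95):
    """Return the indices to keep: one representative per paraphrase
    group, where two items count as paraphrases if the cosine
    similarity of their embeddings exceeds `threshold`."""
    # normalize rows so the dot product equals cosine similarity
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = unit @ unit.T
    keep, dropped = [], set()
    for i in range(len(embeddings)):
        if i in dropped:
            continue
        keep.append(i)
        # drop every later item that is a paraphrase of item i
        for j in range(i + 1, len(embeddings)):
            if sim[i, j] > threshold:
                dropped.add(j)
    return keep

# toy vectors: items 0 and 1 are near-duplicates, item 2 is distinct
emb = np.array([[1.0, 0.0], [0.999, 0.01], [0.0, 1.0]])
print(deduplicate(emb))  # -> [0, 2]
```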
### Languages
The dataset includes several languages, but the vast majority of it is in English. Roughly 600 instances are in more than one language, as detected by [langdetect](https://pypi.org/project/langdetect/).
The languages that appear across the dataset, together with the number of instances they appear in, follow:
<details>
<summary>Language Distribution</summary>
en 4053<br>
zh-cn 423<br>
ko 333<br>
fr 168<br>
ja 151<br>
es 142<br>
no 110<br>
et 97<br>
de 81<br>
ca 78<br>
vi 63<br>
fi 52<br>
zh-tw 47<br>
pt 42<br>
tl 39<br>
ru 24<br>
he 24<br>
id 23<br>
it 22<br>
sv 21<br>
pl 16<br>
nl 16<br>
th 15<br>
ro 11<br>
da 9<br>
tr 8<br>
cs 8<br>
hr 6<br>
uk 5<br>
af 5<br>
ar 4<br>
bg 3<br>
cy 2<br>
sk 2<br>
hu 2<br>
so 2<br>
bn 1<br>
sl 1<br>
hi 1<br>
sw 1<br>
lv 1<br>
el 1<br>
</details>
### Data Fields
Each instance has two fields:
- 'input': one turn of a human-bot conversation, initiated by a human. It starts with 'Human: ' and ends with 'Assistant: '
- 'output': the bot's reply |
CyberHarem/silver_wolf_starrail | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of silver_wolf/銀狼/银狼/은랑 (Honkai: Star Rail)
This is the dataset of silver_wolf/銀狼/银狼/은랑 (Honkai: Star Rail), containing 500 images and their tags.
The core tags of this character are `grey_hair, long_hair, grey_eyes, bangs, eyewear_on_head, ponytail, drill_hair, hair_between_eyes, breasts, sunglasses, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1015.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silver_wolf_starrail/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 484.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silver_wolf_starrail/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1311 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/silver_wolf_starrail/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 849.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/silver_wolf_starrail/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1311 | 1.70 GiB | [Download](https://huggingface.co/datasets/CyberHarem/silver_wolf_starrail/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/silver_wolf_starrail',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results are listed below; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, bubble_blowing, crop_top, midriff, open_jacket, solo, thighs, black_jacket, black_shorts, chewing_gum, fingerless_gloves, fur_trim, long_sleeves, looking_at_viewer, bare_shoulders, hair_ribbon, off_shoulder, short_shorts, stomach, thigh_strap, black_gloves, choker, micro_shorts, navel, sleeveless_shirt, white_shirt, fishnets, belt, boots, cleavage, jewelry, simple_background, sitting, small_breasts, white_background |
| 1 | 18 |  |  |  |  |  | 1girl, bubble_blowing, chewing_gum, fingerless_gloves, solo, looking_at_viewer, black_gloves, black_jacket, fur_trim, navel, black_shorts, midriff, shirt |
| 2 | 6 |  |  |  |  |  | 1girl, black_jacket, fingerless_gloves, looking_at_viewer, solo, goggles_on_head, shorts, black_gloves, closed_mouth, fur_trim, smile |
| 3 | 12 |  |  |  |  |  | chibi, 1girl, solo, black_footwear, holding_gun, short_sleeves, black_shirt, black_shorts, full_body, hair_ribbon, twin_drills |
| 4 | 6 |  |  |  |  |  | 1boy, 1girl, fingerless_gloves, hetero, jacket, penis, solo_focus, black_gloves, blush, cum_in_pussy, goggles_on_head, nipples, open_mouth, sex, vaginal, bottomless, collarbone, fur_trim, looking_at_viewer, shirt_lift, bar_censor, girl_on_top, medium_breasts, multicolored_hair, straddling, tank_top |
| 5 | 5 |  |  |  |  |  | 1girl, blush, completely_nude, indoors, solo, looking_at_viewer, nipples, open_mouth, gigantic_breasts, alternate_breast_size, collarbone, curvy, from_behind, huge_ass, huge_breasts, large_breasts, looking_back, navel, pregnant, sitting, thick_thighs, tinted_eyewear, wide_hips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bubble_blowing | crop_top | midriff | open_jacket | solo | thighs | black_jacket | black_shorts | chewing_gum | fingerless_gloves | fur_trim | long_sleeves | looking_at_viewer | bare_shoulders | hair_ribbon | off_shoulder | short_shorts | stomach | thigh_strap | black_gloves | choker | micro_shorts | navel | sleeveless_shirt | white_shirt | fishnets | belt | boots | cleavage | jewelry | simple_background | sitting | small_breasts | white_background | shirt | goggles_on_head | shorts | closed_mouth | smile | chibi | black_footwear | holding_gun | short_sleeves | black_shirt | full_body | twin_drills | 1boy | hetero | jacket | penis | solo_focus | blush | cum_in_pussy | nipples | open_mouth | sex | vaginal | bottomless | collarbone | shirt_lift | bar_censor | girl_on_top | medium_breasts | multicolored_hair | straddling | tank_top | completely_nude | indoors | gigantic_breasts | alternate_breast_size | curvy | from_behind | huge_ass | huge_breasts | large_breasts | looking_back | pregnant | thick_thighs | tinted_eyewear | wide_hips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:----------|:--------------|:-------|:---------|:---------------|:---------------|:--------------|:--------------------|:-----------|:---------------|:--------------------|:-----------------|:--------------|:---------------|:---------------|:----------|:--------------|:---------------|:---------|:---------------|:--------|:-------------------|:--------------|:-----------|:-------|:--------|:-----------|:----------|:--------------------|:----------|:----------------|:-------------------|:--------|:------------------|:---------|:---------------|:--------|:--------|:-----------------|:--------------|:----------------|:--------------|:------------|:--------------|:-------|:---------|:---------|:--------|:-------------|:--------|:---------------|:----------|:-------------|:------|:----------|:-------------|:-------------|:-------------|:-------------|:--------------|:-----------------|:--------------------|:-------------|:-----------|:------------------|:----------|:-------------------|:------------------------|:--------|:--------------|:-----------|:---------------|:----------------|:---------------|:-----------|:---------------|:-----------------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | | X | | X | | X | X | X | X | X | | X | | | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | | | X | | X | | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 12 |  |  |  |  |  | X | | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | | | | | | | X | X | | X | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | X | | | | | | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Bluebomber182/Branch-From-Trolls | ---
license: unknown
---
|
ryanc/audio_align | ---
dataset_info:
features:
- name: caption
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 63369072189.92
num_examples: 38120
download_size: 28087027560
dataset_size: 63369072189.92
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "audio_align"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AtlasUnified/atlas-pdf-img-cluster | ---
task_categories:
- image-classification
- image-segmentation
- image-to-text
tags:
- OCR
- Text-Image Pairs
size_categories:
- 10M<n<100M
license: osl-3.0
language:
- en
pretty_name: Atlas PDF Image Cluster
---
# Atlas PDF Image Cluster Dataset
Derives from the following Python Pipeline code:
https://github.com/atlasunified/PDF-to-Image-Cluster
# Dataset Description
This dataset is a collection of text extracted from PDF files, originating from various online resources. The dataset was generated using a series of Python scripts forming a robust pipeline that automated the tasks of downloading, converting, and managing the data.
# Dataset Summary
Sample JPG

Corresponding JSON file with Bounding Box and Text data
```
Bounding box: [[0.10698689956331878, 0.008733624454148471], [0.7336244541484717, 0.008733624454148471], [0.7336244541484717, 0.06986899563318777], [0.10698689956331878, 0.06986899563318777]], Text: the Simchas Bais
Bounding box: [[0.013100436681222707, 0.12663755458515283], [0.7314410480349345, 0.12663755458515283], [0.7314410480349345, 0.1965065502183406], [0.013100436681222707, 0.1965065502183406]], Text: they are engaged in
Bounding box: [[0.0, 0.2445414847161572], [0.7379912663755459, 0.23580786026200873], [0.7379912663755459, 0.31222707423580787], [0.0, 0.31877729257641924]], Text: hey could become
Bounding box: [[0.008733624454148471, 0.36026200873362446], [0.7336244541484717, 0.36026200873362446], [0.7336244541484717, 0.425764192139738], [0.008733624454148471, 0.425764192139738]], Text: evil inclination still
Bounding box: [[0.004366812227074236, 0.48034934497816595], [0.31004366812227074, 0.4847161572052402], [0.31004366812227074, 0.5567685589519651], [0.004366812227074236, 0.5502183406113537]], Text: certainly
Bounding box: [[0.36899563318777295, 0.4890829694323144], [0.5480349344978166, 0.4890829694323144], [0.5480349344978166, 0.5524017467248908], [0.36899563318777295, 0.5524017467248908]], Text: men
Bounding box: [[0.5851528384279476, 0.4781659388646288], [0.740174672489083, 0.4781659388646288], [0.740174672489083, 0.5524017467248908], [0.5851528384279476, 0.5524017467248908]], Text: and
Bounding box: [[0.008733624454148471, 0.6004366812227074], [0.7336244541484717, 0.6004366812227074], [0.7336244541484717, 0.6681222707423581], [0.008733624454148471, 0.6681222707423581]], Text: e in separate areas.
Bounding box: [[0.9454148471615721, 0.6157205240174672], [0.9978165938864629, 0.6157205240174672], [0.9978165938864629, 0.6877729257641921], [0.9454148471615721, 0.6877729257641921]], Text: T
Bounding box: [[0.9519650655021834, 0.7532751091703057], [0.9978165938864629, 0.7532751091703057], [0.9978165938864629, 0.8078602620087336], [0.9519650655021834, 0.8078602620087336]], Text: 0
Bounding box: [[0.9475982532751092, 0.851528384279476], [0.9978165938864629, 0.851528384279476], [0.9978165938864629, 0.9235807860262009], [0.9475982532751092, 0.9235807860262009]], Text: fl\n
```
Sample JPG

Corresponding JSON file with Bounding Box and Text data
```
Bounding box: [[0.011570247933884297, 0.428099173553719], [0.9867768595041322, 0.428099173553719], [0.9867768595041322, 0.4677685950413223], [0.011570247933884297, 0.4677685950413223]], Text: tural person subiect to the reguirements laic
Bounding box: [[0.0049586776859504135, 0.5173553719008265], [0.9884297520661157, 0.5140495867768595], [0.9884297520661157, 0.5636363636363636], [0.0049586776859504135, 0.5669421487603306]], Text: priate, the provisions of sections 43 and 44;
Bounding box: [[0.009917355371900827, 0.6082644628099173], [0.9900826446280991, 0.6082644628099173], [0.9900826446280991, 0.6528925619834711], [0.009917355371900827, 0.6528925619834711]], Text: section 3. A person with no municipality of r
Bounding box: [[0.009917355371900827, 0.7041322314049587], [0.9917355371900827, 0.7041322314049587], [0.9917355371900827, 0.743801652892562], [0.009917355371900827, 0.743801652892562]], Text: ied by the authorities in their country of resi
Bounding box: [[0.0049586776859504135, 0.7917355371900826], [0.9917355371900827, 0.7950413223140496], [0.9917355371900827, 0.8396694214876033], [0.0049586776859504135, 0.8347107438016529]], Text: firearm or firearm component in question ir
```
Sample JPG

Corresponding JSON file with Bounding Box and Text data
```
Bounding box: [[0.19349005424954793, 0.5334538878842676], [0.7902350813743219, 0.5370705244122965], [0.7902350813743219, 0.5822784810126582], [0.19349005424954793, 0.5786618444846293]], Text: Generic Drug Description
Bounding box: [[0.19529837251356238, 0.6274864376130199], [0.9909584086799277, 0.6274864376130199], [0.9909584086799277, 0.6708860759493671], [0.19529837251356238, 0.6708860759493671]], Text: Carboxymethylcellulose Sodium (
```
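Every coordinate in the samples above lies in [0, 1], so the four corner points appear to be normalized to the page-image size. A small sketch for mapping one such polygon back to an axis-aligned pixel box (the 916x916 page size used here is a hypothetical example, not taken from the dataset):

```python
def polygon_to_pixel_box(polygon, img_width, img_height):
    """Map a normalized 4-corner bounding polygon to an axis-aligned
    (x_min, y_min, x_max, y_max) box in pixel coordinates."""
    xs = [x * img_width for x, _ in polygon]
    ys = [y * img_height for _, y in polygon]
    return min(xs), min(ys), max(xs), max(ys)

# first bounding box of the first sample, on a hypothetical 916x916 page
corners = [[0.10698689956331878, 0.008733624454148471],
           [0.7336244541484717, 0.008733624454148471],
           [0.7336244541484717, 0.06986899563318777],
           [0.10698689956331878, 0.06986899563318777]]
print(polygon_to_pixel_box(corners, 916, 916))
```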
# Supported Tasks and Use Cases
The primary use case of this dataset is to serve as training data for machine learning models that operate on text data. This may include, but is not limited to, text classification, information extraction, named entity recognition, and machine translation tasks.
# Dataset Creation
This dataset was generated through a multi-stage Python pipeline designed to handle the downloading, conversion, and management of large datasets.
Primary URLs for downloading come from ROM1504's dataset at the following link: http://3080.rom1504.fr/n/text/text38M/
# Data Fields
As the dataset contains text extracted from PDF files from Common Crawl, the data fields primarily include the extracted text and bounding box information. |
liuwei33/images | ---
license: mit
---
|
honggen/hh-hard-v5 | ---
license: apache-2.0
task_categories:
- text2text-generation
language:
- en
---
This dataset mixes hard pairs and random pairs: it consists of 25% hard pairs and 75% random pairs. |
vilsonrodrigues/identification-in-football | ---
license: cc-by-4.0
pretty_name: Identification in Football
size_categories:
- n<1K
---
Football player datasets, labeled by name, made for facial-identification experiments.
The images were taken from the web; they are all CC-protected, so the license has been maintained. |
teerath07/helpsteer_modified_new | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2644119
num_examples: 810
download_size: 1150062
dataset_size: 2644119
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Korean_Conversational_Speech_Data_by_Mobile_Phone | ---
task_categories:
- conversational
language:
- ko
---
# Dataset Card for Nexdata/Korean_Conversational_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1103?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
About 700 Korean speakers participated in the recording and communicated face-to-face in a natural way. They held free discussions on a number of given topics covering a wide range of fields; the speech is natural and fluent, in line with real dialogue scenes. Text is transcribed manually with high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1103?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Korean
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License
### Citation Information
[More Information Needed]
### Contributions |
CyberHarem/becky_blackbell_spyxfamily | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Becky Blackbell
This is the dataset of Becky Blackbell, containing 158 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 158 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 318 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 158 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 158 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 158 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 158 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 158 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 318 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 318 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 318 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Francesco/grass-weeds | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': grass-weeds
'1': 0 ridderzuring
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: grass-weeds
tags:
- rf100
---
# Dataset Card for grass-weeds
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/grass-weeds
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
grass-weeds
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
  'image_id': 15,
  'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
  'width': 640,
  'height': 640,
  'objects': {
    'id': [114, 115, 116, 117],
    'area': [3796, 1596, 152768, 81002],
    'bbox': [
      [302.0, 109.0, 73.0, 52.0],
      [810.0, 100.0, 57.0, 28.0],
      [160.0, 31.0, 248.0, 616.0],
      [741.0, 68.0, 202.0, 401.0]
    ],
    'category': [4, 4, 0, 0]
  }
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
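The COCO `bbox` convention referenced above stores `[x_min, y_min, width, height]`; converting to corner form is a one-liner, shown here on the first object of the sample data instance:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format [x_min, y_min, width, height] bbox to
    [x_min, y_min, x_max, y_max] corners."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# first object of the sample data instance
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # -> [302.0, 109.0, 375.0, 161.0]
```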
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/grass-weeds
### Citation Information
```
@misc{ grass-weeds,
title = { grass weeds Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/grass-weeds } },
url = { https://universe.roboflow.com/object-detection/grass-weeds },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
DBQ/Chanel.Product.prices.United.States | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United States - Chanel - Product-level price list
tags:
- webscraping
- ecommerce
- Chanel
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 748670
num_examples: 1452
download_size: 189766
dataset_size: 748670
---
# Chanel web scraped data
## About the website
The global luxury goods industry, specifically the high-end fashion sector, is a competitive marketplace where brands like **Chanel** thrive. The American market, especially the **United States**, plays a critical role in this industry, as it is one of the world's biggest consumers of luxury products. With its affluent consumers' propensity for luxury and up-scale products, the US market is a major driver of growth in this sector. The recent shift towards **Ecommerce** and digital platforms has further revolutionized luxury sales. The observed dataset provides valuable **Ecommerce product-list page (PLP)** data on Chanel's operations in the United States, highlighting key trends and insights into their online strategies and performance.
## Link to **dataset**
[United States - Chanel - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Chanel%20Product-prices%20United%20States/r/recMdZB18HDhbWplC)
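Given the `full_price`, `price`, and `flg_discount` fields declared in the schema above, the implied discount can be derived per row; a minimal sketch (the example values are hypothetical, not taken from the dataset):

```python
def discount_pct(full_price, price):
    """Percentage discount implied by full_price and price; rows
    without a markdown (flg_discount == 0) should yield 0."""
    if full_price <= 0 or price >= full_price:
        return 0.0
    return round(100.0 * (full_price - price) / full_price, 2)

# hypothetical discounted row
print(discount_pct(1000.0, 750.0))  # -> 25.0
```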
|
alec-groff/warehouse-detection | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: filename
dtype: string
- name: objects
struct:
- name: bbox
sequence:
sequence: float64
- name: categories
sequence: int64
splits:
- name: train
num_bytes: 2551114
num_examples: 4789
- name: validation
num_bytes: 216128
num_examples: 459
- name: test
num_bytes: 128909
num_examples: 229
download_size: 526880
dataset_size: 2896151
---
# Dataset Card for "warehouse-detection"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eci-io/climate-evaluation | ---
language:
- en
task_categories:
- text-classification
- multiple-choice
dataset_info:
- config_name: cdp_qa
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 397518015
num_examples: 548155
- name: validation
num_bytes: 58167638
num_examples: 78876
- name: test
num_bytes: 66654435
num_examples: 92652
download_size: 512401333
dataset_size: 522340088
- config_name: climate_eng
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 625518
num_examples: 2871
- name: validation
num_bytes: 78234
num_examples: 354
- name: test
num_bytes: 81454
num_examples: 355
download_size: 743756
dataset_size: 785206
- config_name: climate_stance
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': 0
'1': 1
'2': 2
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 625518
num_examples: 2871
- name: validation
num_bytes: 78234
num_examples: 354
- name: test
num_bytes: 81454
num_examples: 355
download_size: 743756
dataset_size: 785206
- config_name: climatext
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 847902
num_examples: 6000
- name: validation
num_bytes: 48406
num_examples: 300
- name: test
num_bytes: 260912
num_examples: 1600
download_size: 1385322
dataset_size: 1157220
- config_name: exams
features:
- name: subject
dtype: string
- name: question_stem
dtype: string
- name: choices
dtype: string
- name: answerKey
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 165711
num_examples: 484
download_size: 157661
dataset_size: 165711
- config_name: exeter
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 7495896
num_examples: 23436
- name: validation
num_bytes: 837247
num_examples: 2605
- name: test
num_bytes: 1053039
num_examples: 2904
download_size: 9071528
dataset_size: 9386182
- config_name: translated_exams
features:
- name: subject
dtype: string
- name: question_stem
dtype: string
- name: choices
dtype: string
- name: answerKey
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: idx
dtype: int32
splits:
- name: test
num_bytes: 132380
num_examples: 484
download_size: 125236
dataset_size: 132380
---
Datasets used in the ClimateGPT paper for Climate Evaluation.
* Github Repo containing Prompt Templates for Climate Evaluation: [https://github.com/eci-io/climategpt-evaluation](https://github.com/eci-io/climategpt-evaluation)
* Paper Link: [https://arxiv.org/abs/2401.09646](https://arxiv.org/abs/2401.09646)
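The `answerKey` field in the `exams` and `translated_exams` configs is stored as a class-label index whose names are declared in the YAML above (`'0': A` through `'3': D`); decoding an index back to its letter is straightforward:

```python
# label names for answerKey, as declared in the exams config above
EXAMS_ANSWER_NAMES = ["A", "B", "C", "D"]

def decode_answer_key(label_index):
    """Map a stored class-label index back to its answer letter."""
    return EXAMS_ANSWER_NAMES[label_index]

print(decode_answer_key(2))  # -> C
```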
### Citation Information
```
@misc{thulke2024climategpt,
title={ClimateGPT: Towards AI Synthesizing Interdisciplinary Research on Climate Change},
author={David Thulke and Yingbo Gao and Petrus Pelser and Rein Brune and Rricha Jalota and Floris Fok and Michael Ramos and Ian van Wyk and Abdallah Nasir and Hayden Goldstein and Taylor Tragemann and Katie Nguyen and Ariana Fowler and Andrew Stanco and Jon Gabriel and Jordan Taylor and Dean Moro and Evgenii Tsymbalov and Juliette de Waal and Evgeny Matusov and Mudar Yaghi and Mohammad Shihadah and Hermann Ney and Christian Dugast and Jonathan Dotan and Daniel Erasmus},
year={2024},
eprint={2401.09646},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
|
CyberHarem/zara_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zara/ザラ/扎拉 (Azur Lane)
This is the dataset of zara/ザラ/扎拉 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `breasts, red_hair, long_hair, bangs, large_breasts, side_ponytail, goggles_on_head, yellow_eyes, ahoge, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 901.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 439.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1326 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 766.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1326 | 1.57 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zara_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zara_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
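The IMG+TXT packages pair each image with a same-named `.txt` tag file. A minimal stdlib sketch for walking an extracted package and collecting those pairs (the extension list and the sibling-file layout are assumptions based on the package type, not stated by this card):

```python
from pathlib import Path

# Common raster extensions; an assumption about what the archives contain.
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def pair_images_with_tags(dataset_dir):
    """Collect (image_path, tag_string) pairs; assumes each image has a
    sibling .txt file with the same stem, as in the IMG+TXT packages."""
    pairs = []
    for img in sorted(Path(dataset_dir).rglob("*")):
        if img.suffix.lower() in IMAGE_EXTS:
            txt = img.with_suffix(".txt")
            if txt.is_file():
                pairs.append((img, txt.read_text(encoding="utf-8").strip()))
    return pairs
```

Images without a matching `.txt` file are simply skipped, so the function degrades gracefully on partially extracted archives.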
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 53 |  |  |  |  |  | twintails, beret, 1girl, black_headwear, cleavage, dress, looking_at_viewer, blush, white_gloves, solo, cape, smile, thighhighs, italian_flag, epaulettes, red_eyes, skindentation, simple_background |
| 1 | 6 |  |  |  |  |  | 1boy, 1girl, black_bikini, blush, collarbone, cowgirl_position, girl_on_top, goggles, hetero, looking_at_viewer, o-ring_bikini, sex, solo_focus, wet, bikini_bottom_aside, navel, nipples, vaginal, cleavage, open_mouth, outdoors, penis, pussy, sky, smile, day, mosaic_censoring, pov |
| 2 | 8 |  |  |  |  |  | 1girl, black_bikini, blush, cleavage, collarbone, looking_at_viewer, o-ring_bikini, smile, solo, goggles, wet, hair_between_eyes, official_alternate_costume, sidelocks, closed_mouth, huge_breasts, skindentation, bare_shoulders, water |
| 3 | 8 |  |  |  |  |  | 1girl, black_bikini, blush, cleavage, collarbone, looking_at_viewer, o-ring_bikini, solo, water, wet, official_alternate_costume, poolside, smile, bare_shoulders, navel, pool_ladder, goggles, open_mouth, skindentation, thighs, sidelocks, huge_breasts |
| 4 | 35 |  |  |  |  |  | 1girl, black_bikini, cleavage, looking_at_viewer, o-ring_bikini, collarbone, goggles, solo, blush, whistle_around_neck, smile, skindentation, wet, navel, multi-strapped_bikini, poolside, mouth_hold, water, official_alternate_costume, pool_ladder, sitting |
| 5 | 5 |  |  |  |  |  | 1girl, black_bikini, blush, cleavage, collarbone, day, goggles, looking_at_viewer, o-ring_bikini, outdoors, skindentation, smile, solo, wet, blue_sky, multi-strapped_bikini, navel, ocean, whistle_around_neck, italian_flag, official_alternate_costume, open_mouth, water, armpits, beach, covered_nipples, huge_breasts, kneeling |
| 6 | 13 |  |  |  |  |  | 1girl, black_bikini, cleavage, looking_at_viewer, multi-strapped_bikini, navel, o-ring_bikini, solo, simple_background, white_background, blush, skindentation, smile, wet, collarbone, goggles, official_alternate_costume, sidelocks, bare_shoulders |
| 7 | 8 |  |  |  |  |  | 1boy, 1girl, black_bikini, blush, collarbone, goggles, hetero, o-ring_bikini, solo_focus, wet, cleavage, paizuri, smile, cum_on_breasts, hair_between_eyes, huge_breasts, looking_at_viewer, pov, breasts_squeezed_together, open_mouth, penis, tongue_out, closed_mouth, nipples, sidelocks, water |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | twintails | beret | 1girl | black_headwear | cleavage | dress | looking_at_viewer | blush | white_gloves | solo | cape | smile | thighhighs | italian_flag | epaulettes | red_eyes | skindentation | simple_background | 1boy | black_bikini | collarbone | cowgirl_position | girl_on_top | goggles | hetero | o-ring_bikini | sex | solo_focus | wet | bikini_bottom_aside | navel | nipples | vaginal | open_mouth | outdoors | penis | pussy | sky | day | mosaic_censoring | pov | hair_between_eyes | official_alternate_costume | sidelocks | closed_mouth | huge_breasts | bare_shoulders | water | poolside | pool_ladder | thighs | whistle_around_neck | multi-strapped_bikini | mouth_hold | sitting | blue_sky | ocean | armpits | beach | covered_nipples | kneeling | white_background | paizuri | cum_on_breasts | breasts_squeezed_together | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------|:--------|:--------|:-----------------|:-----------|:--------|:--------------------|:--------|:---------------|:-------|:-------|:--------|:-------------|:---------------|:-------------|:-----------|:----------------|:--------------------|:-------|:---------------|:-------------|:-------------------|:--------------|:----------|:---------|:----------------|:------|:-------------|:------|:----------------------|:--------|:----------|:----------|:-------------|:-----------|:--------|:--------|:------|:------|:-------------------|:------|:--------------------|:-----------------------------|:------------|:---------------|:---------------|:-----------------|:--------|:-----------|:--------------|:---------|:----------------------|:------------------------|:-------------|:----------|:-----------|:--------|:----------|:--------|:------------------|:-----------|:-------------------|:----------|:-----------------|:----------------------------|:-------------|
| 0 | 53 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | | | X | | X | | X | X | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | | | X | | X | | X | X | | X | | X | | | | | X | | | X | X | | | X | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | | | X | | X | | X | X | | X | | X | | | | | X | | | X | X | | | X | | X | | | X | | X | | | X | | | | | | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 35 |  |  |  |  |  | | | X | | X | | X | X | | X | | X | | | | | X | | | X | X | | | X | | X | | | X | | X | | | | | | | | | | | | X | | | | | X | X | X | | X | X | X | X | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | | | X | | X | | X | X | | X | | X | | X | | | X | | | X | X | | | X | | X | | | X | | X | | | X | X | | | | X | | | | X | | | X | | X | | | | X | X | | | X | X | X | X | X | X | | | | | |
| 6 | 13 |  |  |  |  |  | | | X | | X | | X | X | | X | | X | | | | | X | X | | X | X | | | X | | X | | | X | | X | | | | | | | | | | | | X | X | | | X | | | | | | X | | | | | | | | | X | | | | |
| 7 | 8 |  |  |  |  |  | | | X | | X | | X | X | | | | X | | | | | | | X | X | X | | | X | X | X | | X | X | | | X | | X | | X | | | | | X | X | | X | X | X | | X | | | | | | | | | | | | | | | X | X | X | X |
|
yezhengli9/wmt20-zh-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 1079773
num_examples: 2000
download_size: 555360
dataset_size: 1079773
---
# Dataset Card for "wmt20-zh-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bcmaster/virginia1 | ---
license: openrail
---
|
BEE-spoke-data/angle-UAE-pairs | ---
dataset_info:
features:
- name: text1
dtype: string
- name: text2
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 189307831.0
num_examples: 1310368
- name: validation
num_bytes: 6859317.0
num_examples: 50838
- name: test
num_bytes: 55301665.0
num_examples: 402690
download_size: 168093774
dataset_size: 251468813.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: odc-by
task_categories:
- sentence-similarity
- feature-extraction
language:
- en
---
# angle UAE pairs
This dataset combines the four datasets containing pairs for [Universal AnglE Embeddings](https://github.com/SeanLee97/AnglE/tree/main/examples/UAE).
- note that `qrecc` is not included in this dataset
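Rows follow the `text1`/`text2`/`label` schema declared in the metadata above. A hedged sketch of how rows from the four sources could be normalized into that schema (the per-source field names below are assumptions about the upstream datasets, not something this card specifies):

```python
# Map each upstream dataset's sentence-pair fields onto the unified schema.
# These field names are assumptions about the source datasets.
FIELD_MAP = {
    "multi_nli": ("premise", "hypothesis"),
    "snli": ("premise", "hypothesis"),
    "qqp": ("question1", "question2"),
    "mrpc": ("sentence1", "sentence2"),
}

def to_pair(source, row):
    """Normalize one upstream row into {'text1', 'text2', 'label'}."""
    a, b = FIELD_MAP[source]
    return {"text1": row[a], "text2": row[b], "label": int(row["label"])}
```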
```
multi_nli (train set)
snli (train set)
qqp (train set)
mrpc (train set)
``` |
sajjadrauf/VQA | ---
license: afl-3.0
task_categories:
- image-segmentation
- image-classification
- question-answering
language:
- am
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zhenzi/imagenette | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
config_name: tests
splits:
- name: train
num_bytes: 459616258
num_examples: 10500
download_size: 467583804
dataset_size: 459616258
---
|
open-llm-leaderboard/details_macadeliccc__piccolo-8x7b | ---
pretty_name: Evaluation run of macadeliccc/piccolo-8x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/piccolo-8x7b](https://huggingface.co/macadeliccc/piccolo-8x7b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__piccolo-8x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T13:50:45.103516](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-8x7b/blob/main/results_2024-01-21T13-50-45.103516.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6465546091601527,\n\
\ \"acc_stderr\": 0.03216541518825926,\n \"acc_norm\": 0.6461126157257634,\n\
\ \"acc_norm_stderr\": 0.03282893283116576,\n \"mc1\": 0.48225214198286415,\n\
\ \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.6416518638228494,\n\
\ \"mc2_stderr\": 0.015589064706165267\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6672354948805461,\n \"acc_stderr\": 0.013769863046192302,\n\
\ \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778762\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6990639314877515,\n\
\ \"acc_stderr\": 0.00457727584443245,\n \"acc_norm\": 0.8698466440948018,\n\
\ \"acc_norm_stderr\": 0.003357844249123955\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181015,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181015\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941187,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941187\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461777,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461777\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39888268156424583,\n\
\ \"acc_stderr\": 0.016376966142610073,\n \"acc_norm\": 0.39888268156424583,\n\
\ \"acc_norm_stderr\": 0.016376966142610073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398357,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398357\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48225214198286415,\n\
\ \"mc1_stderr\": 0.01749247084307536,\n \"mc2\": 0.6416518638228494,\n\
\ \"mc2_stderr\": 0.015589064706165267\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7202426080363912,\n \
\ \"acc_stderr\": 0.012364384016735319\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/piccolo-8x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|arc:challenge|25_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|gsm8k|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hellaswag|10_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T13-50-45.103516.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- '**/details_harness|winogrande|5_2024-01-21T13-50-45.103516.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T13-50-45.103516.parquet'
- config_name: results
data_files:
- split: 2024_01_21T13_50_45.103516
path:
- results_2024-01-21T13-50-45.103516.parquet
- split: latest
path:
- results_2024-01-21T13-50-45.103516.parquet
---
# Dataset Card for Evaluation run of macadeliccc/piccolo-8x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/piccolo-8x7b](https://huggingface.co/macadeliccc/piccolo-8x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__piccolo-8x7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-21T13:50:45.103516](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__piccolo-8x7b/blob/main/results_2024-01-21T13-50-45.103516.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6465546091601527,
"acc_stderr": 0.03216541518825926,
"acc_norm": 0.6461126157257634,
"acc_norm_stderr": 0.03282893283116576,
"mc1": 0.48225214198286415,
"mc1_stderr": 0.01749247084307536,
"mc2": 0.6416518638228494,
"mc2_stderr": 0.015589064706165267
},
"harness|arc:challenge|25": {
"acc": 0.6672354948805461,
"acc_stderr": 0.013769863046192302,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778762
},
"harness|hellaswag|10": {
"acc": 0.6990639314877515,
"acc_stderr": 0.00457727584443245,
"acc_norm": 0.8698466440948018,
"acc_norm_stderr": 0.003357844249123955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461777,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461777
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39888268156424583,
"acc_stderr": 0.016376966142610073,
"acc_norm": 0.39888268156424583,
"acc_norm_stderr": 0.016376966142610073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398357,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398357
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48225214198286415,
"mc1_stderr": 0.01749247084307536,
"mc2": 0.6416518638228494,
"mc2_stderr": 0.015589064706165267
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.01126851997157768
},
"harness|gsm8k|5": {
"acc": 0.7202426080363912,
"acc_stderr": 0.012364384016735319
}
}
```
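The per-task accuracies above come with standard errors that are consistent with the usual binomial sample standard error. As a sanity-check sketch (assuming `computer_security` has the standard 100 MMLU questions, which is not stated in this card), the reported `acc_stderr` can be reproduced as:

```python
import math

# Reproduce the reported acc_stderr for computer_security (acc = 0.75),
# assuming n = 100 questions: sample standard error sqrt(p * (1 - p) / (n - 1)).
p, n = 0.75, 100
stderr = math.sqrt(p * (1 - p) / (n - 1))

assert abs(stderr - 0.04351941398892446) < 1e-9
```

The same formula appears to explain why tasks sharing an accuracy and question count (e.g. the 0.3 and 0.7 scores above) report identical standard errors.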
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
madabou/dkt_dataset | ---
license: mit
dataset_info:
features:
- name: image
dtype: binary
- name: text
dtype: string
splits:
- name: train
num_bytes: 1302979
num_examples: 4
download_size: 1254355
dataset_size: 1302979
---
|
NMashalov/ru_educational_book_datasets | ---
dataset_info:
features:
- name: page
dtype:
image:
decode: false
- name: label
dtype:
class_label:
names:
'0': '1001'
'1': Bek_linal
'2': Kozel-SM-Sbornik-zadach-po-obschemu-kursu-fiziki-Chast-2-Elektrichestvo-i-magnetizm-Optika
'3': Stereo_prasolov
'4': UssrVopros.ru_★_ekstremalnye_zadachi-1977
'5': algebra_10-11_kolmogorov
'6': shen-geometry
- name: reference
dtype: string
- name: base64encoding
dtype: string
- name: ocr
dtype: string
splits:
- name: train
num_bytes: 1300878591.316
num_examples: 3172
download_size: 1285364011
dataset_size: 1300878591.316
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## Licensing Information
All rights belong to their respective authors, as noted in the `reference` column. This dataset may be used only for personal, non-commercial purposes.
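The schema above exposes both a raw `image` column (`decode: false`) and a `base64encoding` string column. As a hedged sketch (assuming the latter holds the base64-encoded page bytes, which the card does not state explicitly), recovering the raw bytes might look like:

```python
import base64

# Hypothetical stand-in value: a bare PNG magic number encoded to base64,
# playing the role of one entry of the `base64encoding` column.
sample_b64 = base64.b64encode(b"\x89PNG\r\n\x1a\n").decode("ascii")

# Decoding the string yields the raw image bytes again.
raw_bytes = base64.b64decode(sample_b64)
assert raw_bytes.startswith(b"\x89PNG")  # PNG files begin with this signature
```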
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-121000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 662056
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
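The feature schema above describes three parallel nested integer sequences. A minimal mock record, with made-up token values (only the nesting and alignment follow the YAML), might look like:

```python
# Mock record mirroring the schema above; the token values are invented
# for illustration, only the structure matches the declared features.
example = {
    "input_ids": [[101, 7592, 102]],   # sequence of sequences of int32
    "attention_mask": [[1, 1, 1]],     # int8 mask, one entry per token id
    "labels": [[101, 7592, 102]],      # int64 target token ids
}

# Each inner attention mask aligns one-to-one with its token-id sequence.
for ids, mask in zip(example["input_ids"], example["attention_mask"]):
    assert len(ids) == len(mask)
```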
|
open-llm-leaderboard/details_DUAL-GPO__phi-2-gpo-renew2-b0.001-i0 | ---
pretty_name: Evaluation run of DUAL-GPO/phi-2-gpo-renew2-b0.001-i0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DUAL-GPO/phi-2-gpo-renew2-b0.001-i0](https://huggingface.co/DUAL-GPO/phi-2-gpo-renew2-b0.001-i0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DUAL-GPO__phi-2-gpo-renew2-b0.001-i0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T23:00:17.682975](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__phi-2-gpo-renew2-b0.001-i0/blob/main/results_2024-04-15T23-00-17.682975.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5818778649828326,\n\
\ \"acc_stderr\": 0.03368090663732863,\n \"acc_norm\": 0.5832483041851617,\n\
\ \"acc_norm_stderr\": 0.034365297248697683,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5201827066772111,\n\
\ \"mc2_stderr\": 0.015652763110385215\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513782,\n\
\ \"acc_norm\": 0.6407849829351536,\n \"acc_norm_stderr\": 0.014020224155839155\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5853415654252141,\n\
\ \"acc_stderr\": 0.004916561213591283,\n \"acc_norm\": 0.7698665604461262,\n\
\ \"acc_norm_stderr\": 0.00420057853505653\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.025670080636909186,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.025670080636909186\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6967741935483871,\n \"acc_stderr\": 0.026148685930671742,\n \"\
acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.026148685930671742\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752947,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.0249626835643318,\n \
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.0249626835643318\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473065,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473065\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391545,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391545\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460285,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460285\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688235,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688235\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.014893391735249627,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.014893391735249627\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302895,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302895\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192703,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192703\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.423728813559322,\n\
\ \"acc_stderr\": 0.012620785155886001,\n \"acc_norm\": 0.423728813559322,\n\
\ \"acc_norm_stderr\": 0.012620785155886001\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5588235294117647,\n \"acc_stderr\": 0.020087362076702853,\n \
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.020087362076702853\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.02947525023601718,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.02947525023601718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5201827066772111,\n\
\ \"mc2_stderr\": 0.015652763110385215\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637557\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5610310841546626,\n \
\ \"acc_stderr\": 0.013669500369036195\n }\n}\n```"
repo_url: https://huggingface.co/DUAL-GPO/phi-2-gpo-renew2-b0.001-i0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|arc:challenge|25_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|gsm8k|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hellaswag|10_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-00-17.682975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T23-00-17.682975.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- '**/details_harness|winogrande|5_2024-04-15T23-00-17.682975.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T23-00-17.682975.parquet'
- config_name: results
data_files:
- split: 2024_04_15T23_00_17.682975
path:
- results_2024-04-15T23-00-17.682975.parquet
- split: latest
path:
- results_2024-04-15T23-00-17.682975.parquet
---
# Dataset Card for Evaluation run of DUAL-GPO/phi-2-gpo-renew2-b0.001-i0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DUAL-GPO/phi-2-gpo-renew2-b0.001-i0](https://huggingface.co/DUAL-GPO/phi-2-gpo-renew2-b0.001-i0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DUAL-GPO__phi-2-gpo-renew2-b0.001-i0",
"harness_winogrande_5",
	split="latest")
```
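Once downloaded, the aggregated metrics can be read straight out of the results dictionary without any further network access. This is a minimal sketch using an abbreviated copy of the `"all"` block from the "Latest results" section of this card; it simply separates the point estimates from their standard errors:

```python
# Abbreviated copy of the "all" block from this card's results JSON.
results = {
    "all": {
        "acc": 0.5818778649828326,
        "acc_stderr": 0.03368090663732863,
        "acc_norm": 0.5832483041851617,
        "acc_norm_stderr": 0.034365297248697683,
        "mc2": 0.5201827066772111,
        "mc2_stderr": 0.015652763110385215,
    }
}

# Keep every metric whose name does not end in "_stderr",
# i.e. the point estimates themselves.
point_estimates = {
    name: value
    for name, value in results["all"].items()
    if not name.endswith("_stderr")
}
print(sorted(point_estimates))  # ['acc', 'acc_norm', 'mc2']
```

The same pattern applies to the per-task entries (e.g. `harness|arc:challenge|25`), which carry the identical `acc`/`acc_stderr` key layout.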
## Latest results
These are the [latest results from run 2024-04-15T23:00:17.682975](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__phi-2-gpo-renew2-b0.001-i0/blob/main/results_2024-04-15T23-00-17.682975.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5818778649828326,
"acc_stderr": 0.03368090663732863,
"acc_norm": 0.5832483041851617,
"acc_norm_stderr": 0.034365297248697683,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5201827066772111,
"mc2_stderr": 0.015652763110385215
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.014258563880513782,
"acc_norm": 0.6407849829351536,
"acc_norm_stderr": 0.014020224155839155
},
"harness|hellaswag|10": {
"acc": 0.5853415654252141,
"acc_stderr": 0.004916561213591283,
"acc_norm": 0.7698665604461262,
"acc_norm_stderr": 0.00420057853505653
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.025670080636909186,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.025670080636909186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671742,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671742
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752947,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.0249626835643318,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.0249626835643318
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473065,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473065
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391545,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391545
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460285,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460285
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648371,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688235,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688235
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249627,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302895,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302895
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192703,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192703
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.423728813559322,
"acc_stderr": 0.012620785155886001,
"acc_norm": 0.423728813559322,
"acc_norm_stderr": 0.012620785155886001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.020087362076702853,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.020087362076702853
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.02947525023601718,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.02947525023601718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5201827066772111,
"mc2_stderr": 0.015652763110385215
},
"harness|winogrande|5": {
"acc": 0.7379636937647988,
"acc_stderr": 0.012358944431637557
},
"harness|gsm8k|5": {
"acc": 0.5610310841546626,
"acc_stderr": 0.013669500369036195
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KartonCreations/karton | ---
license: cc
---
|
CVasNLPExperiments/OxfordPets_test_google_flan_t5_xl_mode_C_A_T_SPECIFIC_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 984100
num_examples: 3669
download_size: 217354
dataset_size: 984100
---
# Dataset Card for "OxfordPets_test_google_flan_t5_xl_mode_C_A_T_SPECIFIC_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxm/cb | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: string
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 89859
num_examples: 250
- name: test
num_bytes: 93992
num_examples: 250
- name: dev
num_bytes: 22480
num_examples: 56
download_size: 139260
dataset_size: 206331
---
# Dataset Card for "cb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ethical-Lens/Tox1K | ---
license: apache-2.0
---
|
FremyCompany/OS-STS-nl-Dataset | ---
license: other
task_categories:
- sentence-similarity
language:
- nl
pretty_name: OpenSubtitles STS Dataset for Dutch
size_categories:
- 1M<n<10M
---
# OpenSubtitles STS Dataset for Dutch
OS-STS.nl is an extensive Dutch STS dataset containing over two million sentence pairs and similarity scores.
The dataset is automatically extracted from movie and documentary subtitles sourced from OpenSubtitles2018, a vast parallel corpus of aligned video subtitles.
Recognizing the high prevalence (>10%) of paraphrased statements and question-and-answer pairs in subtitled spoken language, we systematically extract consecutive parallel sentence pairs from the subtitles that exhibit significant semantic overlap.
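The exact extraction pipeline is not published here, but the pair-selection step can be sketched as follows: embed both sentences of each consecutive pair, then keep pairs whose embeddings are sufficiently similar. This is a minimal illustration in plain Python; the `embed` callback and the `threshold` value are placeholders, not the actual OS-STS.nl settings (the real pipeline uses sentence-transformers embeddings of the English translations).

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def select_pairs(pairs, embed, threshold=0.5):
    """Keep consecutive sentence pairs with significant semantic overlap.

    `pairs` is a list of (sentence_a, sentence_b) tuples; `embed` maps a
    sentence to its vector (e.g. from an all-mpnet-base-v2 encoder).
    The threshold is illustrative, not the one used for OS-STS.nl.
    """
    kept = []
    for a, b in pairs:
        score = cosine(embed(a), embed(b))
        if score >= threshold:
            kept.append((a, b, score))
    return kept
```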
## Content of the dataset
The dataset contains Dutch sentence pairs, along with semantic similarity scores computed from their English translations using sentence-transformers/all-mpnet-base-v2.
<div style="max-width: 480px">

</div>
**Coming soon** |
maxolotl/must-c-en-fr-wait05_21.7 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 1100095602
num_examples: 5459617
- name: test
num_bytes: 12172258
num_examples: 63342
- name: validation
num_bytes: 5754906
num_examples: 28830
download_size: 185590783
dataset_size: 1118022766
---
# Dataset Card for "must-c-en-fr-wait05_21.7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
monology/flan_plus_cot_100k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 375678160.6229266
num_examples: 98870
download_size: 88030549
dataset_size: 375678160.6229266
---
# Dataset Card for "flan_plus_cot_100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FilledtotheBrim/test_dataset | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_pansophic__new_model_test2 | ---
pretty_name: Evaluation run of pansophic/new_model_test2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pansophic/new_model_test2](https://huggingface.co/pansophic/new_model_test2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pansophic__new_model_test2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T11:44:05.916063](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__new_model_test2/blob/main/results_2024-03-01T11-44-05.916063.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5638420021216752,\n\
\ \"acc_stderr\": 0.03378324751322572,\n \"acc_norm\": 0.5645972727751363,\n\
\ \"acc_norm_stderr\": 0.034480506272325,\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4654289715654586,\n\
\ \"mc2_stderr\": 0.015388256943129463\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.01440561827943618,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.570902210714997,\n\
\ \"acc_stderr\": 0.00493935814556132,\n \"acc_norm\": 0.7536347341167098,\n\
\ \"acc_norm_stderr\": 0.0043001312233406945\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183238,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.037786210790920566,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.037786210790920566\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n\
\ \"acc_stderr\": 0.027218889773308753,\n \"acc_norm\": 0.6451612903225806,\n\
\ \"acc_norm_stderr\": 0.027218889773308753\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.03742597043806585,\n\
\ \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.03742597043806585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909892,\n\
\ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909892\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.032363611119519416,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.032363611119519416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7669724770642202,\n \"acc_stderr\": 0.01812566918086149,\n \"\
acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.01812566918086149\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035286,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035286\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n\
\ \"acc_stderr\": 0.01640909109726879,\n \"acc_norm\": 0.698595146871009,\n\
\ \"acc_norm_stderr\": 0.01640909109726879\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21452513966480447,\n\
\ \"acc_stderr\": 0.013728923407828839,\n \"acc_norm\": 0.21452513966480447,\n\
\ \"acc_norm_stderr\": 0.013728923407828839\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.026959344518747787,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.026959344518747787\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596157,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596157\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3924380704041721,\n\
\ \"acc_stderr\": 0.01247124366922912,\n \"acc_norm\": 0.3924380704041721,\n\
\ \"acc_norm_stderr\": 0.01247124366922912\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.03000856284500348,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.03000856284500348\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307293,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307293\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540596,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824564,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n\
\ \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4654289715654586,\n\
\ \"mc2_stderr\": 0.015388256943129463\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5322213798332069,\n \
\ \"acc_stderr\": 0.013743857303073792\n }\n}\n```"
repo_url: https://huggingface.co/pansophic/new_model_test2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|arc:challenge|25_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|gsm8k|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hellaswag|10_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T11-44-05.916063.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T11-44-05.916063.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- '**/details_harness|winogrande|5_2024-03-01T11-44-05.916063.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T11-44-05.916063.parquet'
- config_name: results
data_files:
- split: 2024_03_01T11_44_05.916063
path:
- results_2024-03-01T11-44-05.916063.parquet
- split: latest
path:
- results_2024-03-01T11-44-05.916063.parquet
---
# Dataset Card for Evaluation run of pansophic/new_model_test2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pansophic/new_model_test2](https://huggingface.co/pansophic/new_model_test2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pansophic__new_model_test2",
"harness_winogrande_5",
split="train")
```
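The aggregated metrics themselves live in the `results` configuration listed in the YAML above. A minimal sketch of pulling the headline numbers out of such a record, using the aggregate values from the latest run as inline sample data (so no download is required; in practice you would load the `results` config with `load_dataset` as shown above):

```python
# Minimal sketch: extracting aggregate metrics from a results record.
# The dict below mirrors the "all" section of the latest results JSON.
results = {
    "all": {
        "acc": 0.5638420021216752,
        "acc_stderr": 0.03378324751322572,
        "mc2": 0.4654289715654586,
    }
}

# Round each aggregate metric for a compact summary.
summary = {name: round(value, 4) for name, value in results["all"].items()}
print(summary)  # {'acc': 0.5638, 'acc_stderr': 0.0338, 'mc2': 0.4654}
```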
## Latest results
These are the [latest results from run 2024-03-01T11:44:05.916063](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__new_model_test2/blob/main/results_2024-03-01T11-44-05.916063.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5638420021216752,
"acc_stderr": 0.03378324751322572,
"acc_norm": 0.5645972727751363,
"acc_norm_stderr": 0.034480506272325,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4654289715654586,
"mc2_stderr": 0.015388256943129463
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.01440561827943618,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974867
},
"harness|hellaswag|10": {
"acc": 0.570902210714997,
"acc_stderr": 0.00493935814556132,
"acc_norm": 0.7536347341167098,
"acc_norm_stderr": 0.0043001312233406945
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183238,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.037786210790920566,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.037786210790920566
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944427,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944427
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6451612903225806,
"acc_stderr": 0.027218889773308753,
"acc_norm": 0.6451612903225806,
"acc_norm_stderr": 0.027218889773308753
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909892,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909892
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.032363611119519416,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.032363611119519416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.01812566918086149,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.01812566918086149
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035286,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035286
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.02581923325648371,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.02581923325648371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.698595146871009,
"acc_stderr": 0.01640909109726879,
"acc_norm": 0.698595146871009,
"acc_norm_stderr": 0.01640909109726879
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688225,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688225
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21452513966480447,
"acc_stderr": 0.013728923407828839,
"acc_norm": 0.21452513966480447,
"acc_norm_stderr": 0.013728923407828839
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.026959344518747787,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.026959344518747787
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596157,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596157
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3924380704041721,
"acc_stderr": 0.01247124366922912,
"acc_norm": 0.3924380704041721,
"acc_norm_stderr": 0.01247124366922912
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.03000856284500348,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.03000856284500348
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307293,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540596,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4654289715654586,
"mc2_stderr": 0.015388256943129463
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838229
},
"harness|gsm8k|5": {
"acc": 0.5322213798332069,
"acc_stderr": 0.013743857303073792
}
}
```
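The per-task `acc` values above can be summarized with a simple unweighted average. Below is a minimal stdlib sketch using a few of the values copied from the results (illustration only — the leaderboard's own aggregation averages over every evaluated subject, not this subset):

```python
from statistics import mean

# A few per-task accuracies copied from the results above
task_acc = {
    "college_biology": 0.6458333333333334,
    "college_chemistry": 0.36,
    "computer_security": 0.74,
    "world_religions": 0.7017543859649122,
}

# Unweighted macro-average over these tasks (illustration only)
avg = mean(task_acc.values())
print(round(avg, 4))
```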
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BangumiBase/holonograffiti | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Holo No Graffiti
This is the image base of the bangumi Holo no Graffiti. We detected 43 characters and 3244 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 135 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 96 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 91 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 10 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 14 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 112 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 111 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 59 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 76 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 95 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 86 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 98 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 83 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 116 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 32 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 142 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 136 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 99 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 97 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 11 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 127 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 78 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 63 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 26 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 66 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 117 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 89 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 88 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 16 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 79 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 74 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 10 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 92 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 89 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 114 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 12 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 10 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 14 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 48 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 37 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 18 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 107 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 171 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
TeamSODA/mcl-signal_processing_attacks_assembly_librispeech | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': 0-benign
'1': 1-kenan
'2': 2-yeehaw
'3': 3-imaginary_clipping
splits:
- name: train
num_bytes: 100461849.0
num_examples: 200
download_size: 87863935
dataset_size: 100461849.0
---
# Dataset Card for "mcl-signal_processing_attacks_assembly"
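The `label` feature declared in the `dataset_info` above is a `class_label` with four categories. A minimal stdlib sketch of the id ↔ name mapping implied by that schema (illustrative only; the `datasets` library exposes the same mapping via its `ClassLabel` feature):

```python
# Class names from the dataset_info schema above
names = ['0-benign', '1-kenan', '2-yeehaw', '3-imaginary_clipping']

# Integer id <-> name mappings, as a class_label feature would expose them
id2name = dict(enumerate(names))
name2id = {name: idx for idx, name in id2name.items()}

print(id2name[3])           # 3-imaginary_clipping
print(name2id['0-benign'])  # 0
```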
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/p08_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of p08/P08/P08 (Girls' Frontline)
This is the dataset of p08/P08/P08 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `short_hair, breasts, hat, brown_eyes, garrison_cap, medium_breasts, white_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 20.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 12.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 43 | 24.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 18.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 43 | 34.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p08_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, cleavage, long_sleeves, white_gloves, blue_jacket, boots, cropped_jacket, smile, thigh_strap, white_background, belt, black_leotard, blush, handgun, military_uniform, open_clothes, simple_background, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | cleavage | long_sleeves | white_gloves | blue_jacket | boots | cropped_jacket | smile | thigh_strap | white_background | belt | black_leotard | blush | handgun | military_uniform | open_clothes | simple_background | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:---------------|:---------------|:--------------|:--------|:-----------------|:--------|:--------------|:-------------------|:-------|:----------------|:--------|:----------|:-------------------|:---------------|:--------------------|:-----------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
nivos/activity_text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1220546.4
num_examples: 10168
- name: test
num_bytes: 152568.3
num_examples: 1271
- name: valid
num_bytes: 152568.3
num_examples: 1271
download_size: 439830
dataset_size: 1525683.0
---
# Dataset Card for "activity_text"
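The split sizes declared above correspond to an 80/10/10 train/test/valid partition of the 12,710 examples. A quick stdlib check (illustration only, using the `num_examples` values from the metadata):

```python
# Example counts from the dataset_info splits above
splits = {'train': 10168, 'test': 1271, 'valid': 1271}

total = sum(splits.values())  # 12710
fractions = {name: n / total for name, n in splits.items()}
print(fractions)  # ~0.8 / 0.1 / 0.1
```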
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alisparda/brawl | ---
license: bigcode-openrail-m
---
|
Tarasb02/test | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3 | ---
pretty_name: Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T17:08:29.374404](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3/blob/main/results_2024-03-02T17-08-29.374404.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538801740167637,\n\
\ \"acc_stderr\": 0.03205838652479307,\n \"acc_norm\": 0.6533974228406447,\n\
\ \"acc_norm_stderr\": 0.03272813802285965,\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7748431207842124,\n\
\ \"mc2_stderr\": 0.013793813723651245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244482,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7155945030870344,\n\
\ \"acc_stderr\": 0.0045020882874701375,\n \"acc_norm\": 0.8910575582553276,\n\
\ \"acc_norm_stderr\": 0.0031093023001762094\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.016909693580248835,\n \"mc2\": 0.7748431207842124,\n\
\ \"mc2_stderr\": 0.013793813723651245\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065597\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923647\n }\n}\n```"
repo_url: https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|arc:challenge|25_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|gsm8k|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hellaswag|10_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T17-08-29.374404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T17-08-29.374404.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- '**/details_harness|winogrande|5_2024-03-02T17-08-29.374404.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T17-08-29.374404.parquet'
- config_name: results
data_files:
- split: 2024_03_02T17_08_29.374404
path:
- results_2024-03-02T17-08-29.374404.parquet
- split: latest
path:
- results_2024-03-02T17-08-29.374404.parquet
---
# Dataset Card for Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-02T17:08:29.374404](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v3/blob/main/results_2024-03-02T17-08-29.374404.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its timestamped split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.6538801740167637,
"acc_stderr": 0.03205838652479307,
"acc_norm": 0.6533974228406447,
"acc_norm_stderr": 0.03272813802285965,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7748431207842124,
"mc2_stderr": 0.013793813723651245
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244482,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869147
},
"harness|hellaswag|10": {
"acc": 0.7155945030870344,
"acc_stderr": 0.0045020882874701375,
"acc_norm": 0.8910575582553276,
"acc_norm_stderr": 0.0031093023001762094
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268584,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.016909693580248835,
"mc2": 0.7748431207842124,
"mc2_stderr": 0.013793813723651245
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065597
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923647
}
}
```
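As a quick illustration, the per-task accuracies above can be ranked to spot the weakest areas (a sketch using a small subset of the numbers reported above, not part of the evaluation pipeline):

```python
# Illustrative subset of the per-task accuracies reported above.
acc = {
    "hendrycksTest-college_mathematics": 0.31,
    "hendrycksTest-global_facts": 0.32,
    "hendrycksTest-abstract_algebra": 0.34,
    "hendrycksTest-world_religions": 0.8538011695906432,
    "hendrycksTest-high_school_government_and_politics": 0.8911917098445595,
}

# Sort task names by accuracy, ascending, and take the three weakest.
weakest = sorted(acc, key=acc.get)[:3]
print(weakest)
```

The same ranking can be run over the full JSON once the "results" configuration has been loaded.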
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ChavyvAkvar/chai-reward-kto-trainer-v3 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 10001346
num_examples: 5159
download_size: 5807386
dataset_size: 10001346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
P1ayer-1/college_textbooks | ---
dataset_info:
features:
- name: authors
dtype: string
- name: color
sequence: float64
- name: depth
dtype: int64
- name: field
dtype: string
- name: id
dtype: int64
- name: match_count
dtype: int64
- name: position
sequence: float64
- name: title
dtype: string
splits:
- name: train
num_bytes: 104695663
num_examples: 637395
download_size: 59625974
dataset_size: 104695663
---
# Dataset Card for "college_textbooks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_257 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1106797912.0
num_examples: 215666
download_size: 1131982296
dataset_size: 1106797912.0
---
# Dataset Card for "chunk_257"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-squad_v2-fdec2e9c-11705559 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/xlm-roberta-large-squad2
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/xlm-roberta-large-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sjrlee](https://huggingface.co/sjrlee) for evaluating this model. |
GalaktischeGurke/emails_5500_to_6000 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12006721.928708866
num_examples: 6537
download_size: 20864531
dataset_size: 12006721.928708866
---
# Dataset Card for "emails_5500_to_6000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_existential_you_have | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 10416
num_examples: 40
- name: train
num_bytes: 18729
num_examples: 74
- name: validation
num_bytes: 3305
num_examples: 13
download_size: 32500
dataset_size: 32450
---
# Dataset Card for "MULTI_VALUE_mrpc_existential_you_have"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
srnarayanaa/opera_test | ---
license: apache-2.0
---
|
tyzhu/find_second_sent_train_10_eval_10_recite | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 46952
num_examples: 30
- name: validation
num_bytes: 15637
num_examples: 10
download_size: 0
dataset_size: 62589
---
# Dataset Card for "find_second_sent_train_10_eval_10_recite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cristibp11/common_voice_13_0_wav2vec2_preprocessed | ---
dataset_info:
config_name: es
features:
- name: input_values
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 28416160808
num_examples: 91374
- name: test
num_bytes: 1946938848
num_examples: 5286
download_size: 30161672462
dataset_size: 30363099656
configs:
- config_name: es
data_files:
- split: train
path: es/train-*
- split: test
path: es/test-*
license: gpl-3.0
language:
- es
pretty_name: Common Voice 13.0 - Wav2Vec2 Preprocessed
---
# Common Voice 13.0 - Wav2Vec2 Preprocessed
Basically took [Common Voice 13.0](https://huggingface.co/datasets/mozilla-foundation/common_voice_13_0), removed all languages
but English and Spanish, removed all splits but train and test, then
preprocessed the data exactly as in [this tutorial](https://huggingface.co/blog/fine-tune-xlsr-wav2vec2) for training a
Wav2Vec2 model for speech recognition. Uploaded with the `push_to_hub` function.
For now, only Spanish is available. Use as follows:
```python
from datasets import load_dataset
train_ds = load_dataset("cristibp11/common_voice_13_0_wav2vec2_preprocessed", "es", split="train")
``` |
Joe02/Character_refs | ---
license: other
---
|
mstz/soybean | ---
language:
- en
tags:
- soybean
- tabular_classification
- binary_classification
- multiclass_classification
- UCI
pretty_name: Soybean
task_categories: # Full list at https://github.com/huggingface/hub-docs/blob/main/js/src/lib/interfaces/Types.ts
- tabular-classification
configs:
- soybean
---
# Soybean
The [Soybean dataset](https://archive-beta.ics.uci.edu/dataset/90/soybean+large) from the [UCI repository](https://archive-beta.ics.uci.edu/).
Classify the type of soybean.
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-----------------------|---------------------------|-----------------|
| soybean | Multiclass classification | Classify the soybean type. |
| diaporthe_stem_canker | Binary classification | Is this instance of class diaporthe_stem_canker? |
| charcoal_rot | Binary classification | Is this instance of class charcoal_rot? |
| rhizoctonia_root_rot | Binary classification | Is this instance of class rhizoctonia_root_rot? |
| phytophthora_rot | Binary classification | Is this instance of class phytophthora_rot? |
| brown_stem_rot | Binary classification | Is this instance of class brown_stem_rot? |
| powdery_mildew | Binary classification | Is this instance of class powdery_mildew? |
| downy_mildew | Binary classification | Is this instance of class downy_mildew? |
| brown_spot | Binary classification | Is this instance of class brown_spot? |
| bacterial_blight | Binary classification | Is this instance of class bacterial_blight? |
| bacterial_pustule | Binary classification | Is this instance of class bacterial_pustule? |
| purple_seed_stain | Binary classification | Is this instance of class purple_seed_stain? |
| anthracnose | Binary classification | Is this instance of class anthracnose? |
| phyllosticta_leaf_spot | Binary classification | Is this instance of class phyllosticta_leaf_spot? |
| alternarialeaf_spot | Binary classification | Is this instance of class alternarialeaf_spot? |
| frog_eye_leaf_spot | Binary classification | Is this instance of class frog_eye_leaf_spot? |
| diaporthe_pod_&_stem_blight | Binary classification | Is this instance of class diaporthe_pod_&_stem_blight? |
| cyst_nematode | Binary classification | Is this instance of class cyst_nematode? |
| 2_4_d_injury | Binary classification | Is this instance of class 2_4_d_injury? |
| herbicide_injury | Binary classification | Is this instance of class herbicide_injury? | |
gorilla-llm/Berkeley-Function-Calling-Leaderboard | ---
license: apache-2.0
language:
- en
---
# Berkeley Function Calling Leaderboard
<!-- Provide a quick summary of the dataset. -->
The Berkeley Function Calling Leaderboard is a live leaderboard that evaluates the ability of different LLMs to call functions (also referred to as tools).
We built this dataset from our learnings to be representative of most users' function-calling use cases, for example in agents or as part of enterprise workflows.
To this end, our evaluation dataset spans diverse categories and multiple programming languages.
Checkout the Leaderboard at [gorilla.cs.berkeley.edu/leaderboard.html](https://gorilla.cs.berkeley.edu/leaderboard.html)
and our [release blog](https://gorilla.cs.berkeley.edu/blogs/8_berkeley_function_calling_leaderboard.html)!
***Latest Version Release Date***: 4/09/2024
***Original Release Date***: 02/26/2024
### Change Log
The Berkeley Function Calling Leaderboard is a continually evolving project. We are committed to regularly updating the dataset and leaderboard by introducing new models and expanding evaluation categories. Below is an overview of the modifications implemented in the most recent version:
* [April 10, 2024] [#339](https://github.com/ShishirPatil/gorilla/pull/339): Introduce REST API sanity check for the executable test category. It ensures that all the API endpoints involved during the execution evaluation process are working properly. If any of them are not behaving as expected, the evaluation process will be stopped by default as the result will be inaccurate. Users can choose to bypass this check by setting the `--skip-api-sanity-check` flag.
* [April 9, 2024] [#338](https://github.com/ShishirPatil/gorilla/pull/338): Bug fix in the evaluation datasets (including both prompts and function docs). Bug fix for possible answers as well.
* [April 8, 2024] [#330](https://github.com/ShishirPatil/gorilla/pull/330): Fixed an oversight that was introduced in [#299](https://github.com/ShishirPatil/gorilla/pull/299). For function-calling (FC) models that cannot take `float` type in input, when the parameter type is a `float`, the evaluation procedure will convert that type to `number` in the model input and mention in the parameter description that `This is a float type value.`. An additional field `format: float` will also be included in the model input to make it clear about the type. Updated the model handler for Claude, Mistral, and OSS to better parse the model output.
* [April 3, 2024] [#309](https://github.com/ShishirPatil/gorilla/pull/309): Bug fix for evaluation dataset possible answers. Implement **string standardization** for the AST evaluation pipeline, i.e. removing white spaces and a subset of punctuations `,./-_*^` to make the AST evaluation more robust and accurate. Fixed AST evaluation issue for type `tuple`. Add 2 new models `meetkai/functionary-small-v2.4 (FC)`, `meetkai/functionary-medium-v2.4 (FC)` to the leaderboard.
* [April 1, 2024] [#299](https://github.com/ShishirPatil/gorilla/pull/299): Leaderboard update with new models (`Claude-3-Haiku`, `Databrick-DBRX-Instruct`), more advanced AST evaluation procedure, and updated evaluation datasets. Cost and latency statistics during evaluation are also measured. We also released the manual that our evaluation procedure is based on, available [here](https://gorilla.cs.berkeley.edu/blogs/8_berkeley_function_calling_leaderboard.html#metrics).
* [Mar 11, 2024] [#254](https://github.com/ShishirPatil/gorilla/pull/254): Leaderboard update with 3 new models: `Claude-3-Opus-20240229 (Prompt)`, `Claude-3-Sonnet-20240229 (Prompt)`, and `meetkai/functionary-medium-v2.2 (FC)`
* [Mar 5, 2024] [#237](https://github.com/ShishirPatil/gorilla/pull/237) and [238](https://github.com/ShishirPatil/gorilla/pull/238): leaderboard update resulting from [#223](https://github.com/ShishirPatil/gorilla/pull/223); 3 new models: `mistral-large-2402`, `gemini-1.0-pro`, and `gemma`.
* [Feb 29, 2024] [#223](https://github.com/ShishirPatil/gorilla/pull/223): Modifications to REST evaluation.
* [Feb 27, 2024] [#215](https://github.com/ShishirPatil/gorilla/pull/215): BFCL first release.
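As a rough illustration of the string standardization mentioned in the April 3 entry, here is a minimal sketch of the idea (our own, not the actual evaluation code; the function name is invented):

```python
# Sketch of the described standardization: strip white space and the
# punctuation subset ,./-_*^ before comparing a model's string argument
# against the possible answers, so superficial formatting differences
# don't fail the AST comparison.
def standardize_string(s: str) -> str:
    remove = set(" ,./-_*^")
    return "".join(ch for ch in s if ch not in remove)

# Two surface forms of the same value now compare equal.
print(standardize_string("New-York, N.Y.") == standardize_string("NewYork NY"))  # True
```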
## Dataset Composition

| # | Category |
|---|----------|
|200 | Chatting Capability|
|100 | Simple (Exec)|
|50 | Multiple (Exec)|
|50 | Parallel (Exec)|
|40 | Parallel & Multiple (Exec)|
|400 | Simple (AST)|
|200 | Multiple (AST)|
|200 | Parallel (AST)|
|200 | Parallel & Multiple (AST)|
|240 | Relevance|
|70 | REST|
|100 | Java|
|100 | SQL|
|50 | Javascript|
### Dataset Description
We break down the majority of the evaluation into two categories:
- **Python**: Simple Function, Multiple Function, Parallel Function, Parallel Multiple Function
- **Non-Python**: Chatting Capability, Function Relevance Detection, REST API, SQL, Java, Javascript
#### Python
**Simple**: The simple function category contains the most common format: the user supplies a single JSON function document, and exactly one function call is invoked.
**Multiple Function**: The multiple function category contains a user question that invokes only one function call out of 2 to 4 provided JSON function documents. The model must select the best function to invoke given the user-provided context.
**Parallel Function**: Parallel function means invoking multiple function calls in parallel for one user query. The model must determine how many function calls need to be made; the question posed to the model can be a single sentence or multiple sentences.
**Parallel Multiple Function**: Parallel multiple function combines parallel function and multiple function. In other words, the model is provided with multiple function documents, and each of the corresponding function calls may be invoked zero or more times.
Each category has both an AST and a corresponding executable evaluation. For the executable evaluation data, we manually write Python functions drawing inspiration from free REST API endpoints (e.g. get weather) and functions (e.g. linear regression) that compute results directly. The executable category is designed to assess whether generated function calls can be reliably executed in real-world applications that use function calling.
#### Non-Python Evaluation
While the previous categories make up the majority of our evaluation, we include additional categories, namely Chatting Capability, Function Relevance Detection, REST API, SQL, Java, and JavaScript, to evaluate model performance across diverse scenarios, test support for multiple programming languages, and check resilience to irrelevant questions and function documents.
**Chatting Capability**: In Chatting Capability, we design scenarios where no functions are passed in and the user asks generic questions - this is similar to using the model as a general-purpose chatbot. We evaluate whether the model can output chat messages and recognize that it does not need to invoke any functions. Note the difference from “Relevance”, where the model is also expected to evaluate whether any of the provided function inputs are relevant. We include this category for internal model evaluation and exclude its statistics from the live leaderboard. We are currently working on a better evaluation of chat ability that ensures the chat is relevant and coherent with users' requests, and we are open to suggestions and feedback from the community.
**Function Relevance Detection**: In function relevance detection, we design scenarios where none of the provided functions are relevant or should be invoked. We expect the model's output to be a non-function-call response. This scenario provides insight into whether a model will hallucinate functions and parameters to generate function calls despite lacking the function information or user instructions to do so.
**REST API**: A majority of real-world API calls are REST API calls. Python mainly makes REST API calls through `requests.get()`, `requests.post()`, `requests.delete()`, etc., which are included in the Python `requests` library. `GET` requests are the most common in the real world. As a result, we include real-world `GET` requests to test the model's ability to generate executable REST API calls from complex function documentation, using `requests.get()` along with the API's hardcoded URL and a description of the purpose of the function and its parameters. Our evaluation includes two variations. The first type requires passing parameters inside the URL, called path parameters, for example the `{Year}` and `{CountryCode}` in `GET /api/v3/PublicHolidays/{Year}/{CountryCode}`. The second type requires the model to put parameters as key/value pairs into the `params` and/or `headers` argument of `requests.get()`, for example `params={'lang': 'fr'}` in the function call. The model is not told which type of REST API call it should make; it must decide how the call is to be invoked.
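As a sketch, the two variations differ only in where the parameters end up in the final URL. The host and endpoints below are illustrative, not taken from the evaluation set, and the construction is shown with the standard library rather than by issuing a live request:

```python
from urllib.parse import urlencode

BASE = "https://api.example.com"  # illustrative host

# Variation 1: path parameters embedded directly in the URL,
# as in requests.get(f"{BASE}/api/v3/PublicHolidays/{year}/{country_code}")
year, country_code = 2024, "US"
path_style_url = f"{BASE}/api/v3/PublicHolidays/{year}/{country_code}"

# Variation 2: key/value pairs passed via the `params` argument,
# as in requests.get(f"{BASE}/translate", params={"lang": "fr"})
query_style_url = f"{BASE}/translate?{urlencode({'lang': 'fr'})}"

print(path_style_url)   # https://api.example.com/api/v3/PublicHolidays/2024/US
print(query_style_url)  # https://api.example.com/translate?lang=fr
```

The model sees only the function documentation, so it must infer from the URL template and parameter descriptions which of the two forms applies.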
For REST API, we use executable evaluation to check the outputs' successful execution, response type, and response JSON key consistency. We chose not to perform AST evaluation on REST, mainly because of the immense number of possible answers; enumerating all of them is infeasible for complex APIs.
**SQL**: SQL evaluation data includes our customized `sql.execute` functions, which take `sql_keyword`, `table_name`, `columns`, and `conditions`. Those four parameters provide the necessary information to construct a simple SQL query like `SELECT column_A FROM table_B WHERE column_C == D`. Through this, we want to see whether SQL queries can be reliably constructed via function calling, rather than by training a SQL-specific model. In our evaluation dataset, we restricted the scenarios to simple keywords: `SELECT`, `INSERT INTO`, `UPDATE`, `DELETE`, and `CREATE`. We included 100 examples for SQL AST evaluation. Note that SQL AST evaluation is not included in our leaderboard calculations: we use SQL evaluation to test the generalization of function calling to programming languages not included in the training set of Gorilla OpenFunctions-v2, but we opted to exclude SQL performance from the BFCL AST evaluation because there are many ways to construct SQL function calls that achieve identical outcomes. We are currently working on a better evaluation of SQL and are open to suggestions and feedback from the community; SQL has therefore been omitted from the current leaderboard to pave the way for a more comprehensive evaluation in subsequent iterations.
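A minimal sketch of how a `sql.execute`-style call might map its four parameters onto a query string (the function below is illustrative, not the evaluation's actual implementation, and only the `SELECT` keyword is shown):

```python
def build_sql(sql_keyword: str, table_name: str, columns: list, conditions: list) -> str:
    # Illustrative sketch covering only SELECT; the evaluation also
    # supports INSERT INTO, UPDATE, DELETE, and CREATE.
    assert sql_keyword.upper() == "SELECT"
    query = f"SELECT {', '.join(columns)} FROM {table_name}"
    if conditions:
        query += " WHERE " + " AND ".join(conditions)
    return query

print(build_sql("SELECT", "table_B", ["column_A"], ["column_C == 'D'"]))
# SELECT column_A FROM table_B WHERE column_C == 'D'
```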
**Java and JavaScript**: Although function calling formats are similar across most programming languages, each language has language-specific types; Java, for example, has the `HashMap` type. The goal of this test category is to understand how well a function-calling model can be extended beyond Python types to other languages' specific typings. We included 100 examples for Java AST evaluation and 50 examples for JavaScript AST evaluation.
The categories outlined above provide insight into the performance of different models across popular API call scenarios, offering valuable perspectives on the potential of function-calling models.
### Evaluation
This dataset serves as the question + function documentation pairs for Berkeley Function-Calling Leaderboard (BFCL) evaluation. The source code for the evaluation process can be found [here](https://github.com/ShishirPatil/gorilla/tree/main/berkeley-function-call-leaderboard) with detailed instructions on how to use this dataset to compare LLM tool use capabilities across different models and categories.
More details on evaluation metrics, i.e. rules for the Abstract Syntax Tree (AST) and executable evaluation can be found in the [release blog](https://gorilla.cs.berkeley.edu/blogs/8_berkeley_function_calling_leaderboard.html#metrics).
### Contributing
All models and the data used to train them are released under the Apache 2.0 license.
Gorilla is an open-source effort from UC Berkeley, and we welcome contributors.
Please email us your comments, criticisms, and questions.
More information about the project can be found at https://gorilla.cs.berkeley.edu/
### BibTex
```bibtex
@misc{berkeley-function-calling-leaderboard,
title={Berkeley Function Calling Leaderboard},
author={Fanjia Yan and Huanzhi Mao and Charlie Cheng-Jie Ji and Tianjun Zhang and Shishir G. Patil and Ion Stoica and Joseph E. Gonzalez},
howpublished={\url{https://gorilla.cs.berkeley.edu/blogs/8_berkeley_function_calling_leaderboard.html}},
year={2024},
}
```
|
monet-joe/cv_backbones | ---
license: mit
task_categories:
- image-classification
- feature-extraction
language:
- en
tags:
- code
pretty_name: Vi-Backbones
size_categories:
- n<1K
viewer: false
---
# Dataset Card for "monet-joe/cv_backbones"
This repository consolidates the backbone networks of the pre-trained computer vision models available on the PyTorch official website. It mainly includes various Convolutional Neural Networks (CNNs) and Vision Transformer models pre-trained on the ImageNet-1K dataset. The collection is divided into two subsets, V1 and V2, encompassing multiple classic and advanced versions of vision models. These pre-trained backbone networks give users a robust foundation for transfer learning in tasks such as image recognition, object detection, and image segmentation, and offer researchers and practitioners a convenient way to apply these pre-trained models flexibly in different scenarios.
## Viewer
<https://huggingface.co/spaces/monet-joe/cv-backbones>
### Data Fields
| ver | type | input_size | url |
| :-----------: | :-----------: | :--------------: | :-------------------------------: |
| backbone name | backbone type | input image size | url of pretrained model .pth file |
### Splits
| subsets |
| :--: |
| IMAGENET1K_V1 |
| IMAGENET1K_V2 |
## Maintenance
```bash
git clone git@hf.co:datasets/monet-joe/cv_backbones
```
## Usage
```python
from datasets import load_dataset
backbones = load_dataset("monet-joe/cv_backbones")
for weights in backbones["IMAGENET1K_V1"]:
print(weights)
for weights in backbones["IMAGENET1K_V2"]:
print(weights)
```
## Param count
### IMAGENET1K_V1
| Backbone | Params(M) |
| :----------------: | :-------: |
| SqueezeNet1_0 | 1.2 |
| SqueezeNet1_1 | 1.2 |
| ShuffleNet_V2_X0_5 | 1.4 |
| MNASNet0_5 | 2.2 |
| ShuffleNet_V2_X1_0 | 2.3 |
| MobileNet_V3_Small | 2.5 |
| MNASNet0_75 | 3.2 |
| MobileNet_V2 | 3.5 |
| ShuffleNet_V2_X1_5 | 3.5 |
| RegNet_Y_400MF | 4.3 |
| MNASNet1_0 | 4.4 |
| EfficientNet_B0 | 5.3 |
| MobileNet_V3_Large | 5.5 |
| RegNet_X_400MF | 5.5 |
| MNASNet1_3 | 6.3 |
| RegNet_Y_800MF | 6.4 |
| GoogLeNet | 6.6 |
| RegNet_X_800MF | 7.3 |
| ShuffleNet_V2_X2_0 | 7.4 |
| EfficientNet_B1 | 7.8 |
| DenseNet121 | 8 |
| EfficientNet_B2 | 9.1 |
| RegNet_X_1_6GF | 9.2 |
| RegNet_Y_1_6GF | 11.2 |
| ResNet18 | 11.7 |
| EfficientNet_B3 | 12.2 |
| DenseNet169 | 14.1 |
| RegNet_X_3_2GF | 15.3 |
| EfficientNet_B4 | 19.3 |
| RegNet_Y_3_2GF | 19.4 |
| DenseNet201 | 20 |
| EfficientNet_V2_S | 21.5 |
| ResNet34 | 21.8 |
| ResNeXt50_32X4D | 25 |
| ResNet50 | 25.6 |
| Inception_V3 | 27.2 |
| Swin_T | 28.3 |
| Swin_V2_T | 28.4 |
| ConvNeXt_Tiny | 28.6 |
| DenseNet161 | 28.7 |
| EfficientNet_B5 | 30.4 |
| MaxVit_T | 30.9 |
| RegNet_Y_8GF | 39.4 |
| RegNet_X_8GF | 39.6 |
| EfficientNet_B6 | 43 |
| ResNet101 | 44.5 |
| Swin_S | 49.6 |
| Swin_V2_S | 49.7 |
| ConvNeXt_Small | 50.2 |
| EfficientNet_V2_M | 54.1 |
| RegNet_X_16GF | 54.3 |
| ResNet152 | 60.2 |
| AlexNet | 61.1 |
| EfficientNet_B7 | 66.3 |
| Wide_ResNet50_2 | 68.9 |
| ResNeXt101_64X4D | 83.5 |
| RegNet_Y_16GF | 83.6 |
| ViT_B_16 | 86.6 |
| Swin_B | 87.8 |
| Swin_V2_B | 87.9 |
| ViT_B_32 | 88.2 |
| ConvNeXt_Base | 88.6 |
| ResNeXt101_32X8D | 88.8 |
| RegNet_X_32GF | 107.8 |
| EfficientNet_V2_L | 118.5 |
| Wide_ResNet101_2 | 126.9 |
| VGG11_BN | 132.9 |
| VGG11 | 132.9 |
| VGG13 | 133 |
| VGG13_BN | 133.1 |
| VGG16_BN | 138.4 |
| VGG16 | 138.4 |
| VGG19_BN | 143.7 |
| VGG19 | 143.7 |
| RegNet_Y_32GF | 145 |
| ConvNeXt_Large | 197.8 |
| ViT_L_16 | 304.3 |
| ViT_L_32 | 306.5 |
### IMAGENET1K_V2
| Backbone | Params(M) |
| :----------------: | :-------: |
| MobileNet_V2 | 3.5 |
| RegNet_Y_400MF | 4.3 |
| MobileNet_V3_Large | 5.5 |
| RegNet_X_400MF | 5.5 |
| RegNet_Y_800MF | 6.4 |
| RegNet_X_800MF | 7.3 |
| EfficientNet_B1 | 7.8 |
| RegNet_X_1_6GF | 9.2 |
| RegNet_Y_1_6GF | 11.2 |
| RegNet_X_3_2GF | 15.3 |
| RegNet_Y_3_2GF | 19.4 |
| ResNeXt50_32X4D | 25 |
| ResNet50 | 25.6 |
| RegNet_Y_8GF | 39.4 |
| RegNet_X_8GF | 39.6 |
| ResNet101 | 44.5 |
| RegNet_X_16GF | 54.3 |
| ResNet152 | 60.2 |
| Wide_ResNet50_2 | 68.9 |
| RegNet_Y_16GF | 83.6 |
| ResNeXt101_32X8D | 88.8 |
| RegNet_X_32GF | 107.8 |
| Wide_ResNet101_2 | 126.9 |
| RegNet_Y_32GF | 145 |
## Mirror
<https://www.modelscope.cn/datasets/monetjoe/cv_backbones>
## Reference
[1] <https://pytorch.org/vision/main/_modules><br>
[2] <https://pytorch.org/vision/main/models.html> |
vunhucuongit/handwriting | ---
license: apache-2.0
---
|
chainyo/rvl-cdip-invoice | ---
license: other
---
⚠️ This is only a subset of the original dataset, containing only the `invoice` class.
The RVL-CDIP (Ryerson Vision Lab Complex Document Information Processing) dataset consists of 400,000 grayscale images in 16 classes, with 25,000 images per class. There are 320,000 training images, 40,000 validation images, and 40,000 test images. The images are sized so their largest dimension does not exceed 1000 pixels.
For questions and comments please contact Adam Harley (aharley@scs.ryerson.ca).
The full dataset can be found [here](https://www.cs.cmu.edu/~aharley/rvl-cdip/).
## Labels
0: letter
1: form
2: email
3: handwritten
4: advertisement
5: scientific report
6: scientific publication
7: specification
8: file folder
9: news article
10: budget
11: invoice
12: presentation
13: questionnaire
14: resume
15: memo
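For programmatic use, the id-to-label mapping above can be kept as a simple lookup (a sketch; label names follow the standard RVL-CDIP class names):

```python
ID2LABEL = {
    0: "letter", 1: "form", 2: "email", 3: "handwritten",
    4: "advertisement", 5: "scientific report", 6: "scientific publication",
    7: "specification", 8: "file folder", 9: "news article",
    10: "budget", 11: "invoice", 12: "presentation",
    13: "questionnaire", 14: "resume", 15: "memo",
}
# Inverse mapping, useful when encoding string labels back to class ids.
LABEL2ID = {name: idx for idx, name in ID2LABEL.items()}

print(LABEL2ID["invoice"])  # 11
```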
## Citation
This dataset is from this [paper](https://www.cs.cmu.edu/~aharley/icdar15/) `A. W. Harley, A. Ufkes, K. G. Derpanis, "Evaluation of Deep Convolutional Nets for Document Image Classification and Retrieval," in ICDAR, 2015`
## License
RVL-CDIP is a subset of IIT-CDIP, which came from the [Legacy Tobacco Document Library](https://www.industrydocuments.ucsf.edu/tobacco/), for which license information can be found [here](https://www.industrydocuments.ucsf.edu/help/copyright/).
## References
1. D. Lewis, G. Agam, S. Argamon, O. Frieder, D. Grossman, and J. Heard, "Building a test collection for complex document information processing," in Proc. 29th Annual Int. ACM SIGIR Conference (SIGIR 2006), pp. 665-666, 2006
2. The Legacy Tobacco Document Library (LTDL), University of California, San Francisco, 2007. http://legacy.library.ucsf.edu/. |
CyberHarem/katarina_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Katarina (Fire Emblem)
This is the dataset of Katarina (Fire Emblem), containing 96 images and their tags.
The core tags of this character are `purple_hair, short_hair, breasts, grey_eyes, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 96 | 127.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 96 | 72.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 213 | 143.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 96 | 112.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 213 | 204.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katarina_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/katarina_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, nipples, blush, hetero, open_mouth, penis, 1boy, sex, solo_focus, cum_in_pussy, nude, thighhighs, vaginal, medium_breasts, spread_legs, bar_censor, large_breasts, lying, scarf, sweat |
| 1 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, looking_at_viewer, solo, white_background, nipples, simple_background, nude, ass, collarbone, looking_back, navel, smile |
| 2 | 8 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, simple_background, red_scarf, upper_body, short_sleeves, smile, blush, white_background, dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | nipples | blush | hetero | open_mouth | penis | 1boy | sex | solo_focus | cum_in_pussy | nude | thighhighs | vaginal | medium_breasts | spread_legs | bar_censor | large_breasts | lying | scarf | sweat | looking_at_viewer | solo | white_background | simple_background | ass | collarbone | looking_back | navel | smile | red_scarf | upper_body | short_sleeves | dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:--------|:---------|:-------------|:--------|:-------|:------|:-------------|:---------------|:-------|:-------------|:----------|:-----------------|:--------------|:-------------|:----------------|:--------|:--------|:--------|:--------------------|:-------|:-------------------|:--------------------|:------|:-------------|:---------------|:--------|:--------|:------------|:-------------|:----------------|:--------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | | | | | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | X | X | X | X | X |
|
anan-2024/twitter_dataset_1713059314 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20482
num_examples: 46
download_size: 11206
dataset_size: 20482
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
generative-newsai/news-unmasked | ---
dataset_info:
features:
- name: image
dtype: image
- name: section
dtype: string
- name: headline
dtype: string
- name: image_id
dtype: string
splits:
- name: train
num_bytes: 5084636867.984
num_examples: 48988
- name: test
num_bytes: 1360809852.398
num_examples: 12247
download_size: 1331950856
dataset_size: 6445446720.382
task_categories:
- image-to-text
pretty_name: NewsUnmasked
---
# Dataset Card for "news-unmasked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATIE-AQ/squad_v2_french_translated_fr_prompt_context_generation_with_answer_and_question | ---
language:
- fr
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- squad_v2_french_translated
---
# squad_v2_french_translated_fr_prompt_context_generation_with_answer_and_question
## Summary
**squad_v2_french_translated_fr_prompt_context_generation_with_answer_and_question** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **1,271,928** rows that can be used for a context-generation (with answer and question) task.
The original data (without prompts) comes from the dataset [pragnakalp/squad_v2_french_translated](https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated) and was augmented by questions in SQUAD 2.0 format in the [FrenchQA]( https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
24 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative tense, in the informal French address form (tutoiement), and in the formal form (vouvoiement).
```
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", écrire un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", écris un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", écrivez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", rédiger un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", rédige un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", rédigez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", générer un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", génère un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", générez un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", créer un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", crée un texte explicatif.\nTexte : ',
'Étant donné la réponse "'+ answer+'" à la question "'+question+'", créez un texte explicatif.\nTexte : ',
'Ecrire un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Ecris un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Ecrivez un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Rédiger un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Rédige un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Rédigez un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Générer un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Génère un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Générez un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Créer un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Crée un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : ',
'Créez un texte comme contexte de la réponse "'+ answer+'" à la question "'+question+'" \nTexte : '
```
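As a sketch, applying one of these templates to an (answer, question, context) triple yields an input/target pair (the `inputs`/`targets` column names follow the xP3 convention and are illustrative here):

```python
def build_example(answer: str, question: str, context: str) -> dict:
    # One of the 24 templates listed above; the original context serves
    # as the generation target.
    prompt = (
        'Étant donné la réponse "' + answer + '" à la question "'
        + question + '", écrire un texte explicatif.\nTexte : '
    )
    return {"inputs": prompt, "targets": context}

ex = build_example(
    "Paris",
    "Quelle est la capitale de la France ?",
    "Paris est la capitale de la France.",
)
print(ex["inputs"])
```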
# Splits
- `train` with 442,752 samples
- no `valid` split
- no `test` split
# How to use?
```python
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/squad_v2_french_translated_fr_prompt_context_generation_with_answer_and_question")
```
# Citation
## Original data
> Hugging Face repository: https://huggingface.co/datasets/pragnakalp/squad_v2_french_translated
## This Dataset
> @misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
> author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
> title = { DFP (Revision 1d24c09) },
> year = 2023,
> url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
> doi = { 10.57967/hf/1200 },
> publisher = { Hugging Face }
> }
## License
apache-2.0 |
open-llm-leaderboard/details_shuvom__yuj-v1 | ---
pretty_name: Evaluation run of shuvom/yuj-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shuvom/yuj-v1](https://huggingface.co/shuvom/yuj-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shuvom__yuj-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T14:45:38.399135](https://huggingface.co/datasets/open-llm-leaderboard/details_shuvom__yuj-v1/blob/main/results_2024-02-13T14-45-38.399135.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4368690144541721,\n\
\ \"acc_stderr\": 0.03434467285481173,\n \"acc_norm\": 0.44258872704716845,\n\
\ \"acc_norm_stderr\": 0.03518630942288087,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.4169072304332008,\n\
\ \"mc2_stderr\": 0.015264102154015026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41638225255972694,\n \"acc_stderr\": 0.014405618279436172,\n\
\ \"acc_norm\": 0.4564846416382253,\n \"acc_norm_stderr\": 0.014555949760496437\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5310695080661223,\n\
\ \"acc_stderr\": 0.004980138679161042,\n \"acc_norm\": 0.700955984863573,\n\
\ \"acc_norm_stderr\": 0.004569034613332594\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n\
\ \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205615,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205615\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237657,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237657\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n\
\ \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n\
\ \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.03144712581678242,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.03144712581678242\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5252525252525253,\n \"acc_stderr\": 0.03557806245087314,\n \"\
acc_norm\": 0.5252525252525253,\n \"acc_norm_stderr\": 0.03557806245087314\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6321243523316062,\n \"acc_stderr\": 0.034801756684660366,\n\
\ \"acc_norm\": 0.6321243523316062,\n \"acc_norm_stderr\": 0.034801756684660366\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4230769230769231,\n \"acc_stderr\": 0.02504919787604234,\n \
\ \"acc_norm\": 0.4230769230769231,\n \"acc_norm_stderr\": 0.02504919787604234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478465,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478465\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"\
acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"\
acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4214876033057851,\n \"acc_stderr\": 0.04507732278775094,\n \"\
acc_norm\": 0.4214876033057851,\n \"acc_norm_stderr\": 0.04507732278775094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.02999695185834948,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.02999695185834948\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5644955300127714,\n\
\ \"acc_stderr\": 0.01773058992792659,\n \"acc_norm\": 0.5644955300127714,\n\
\ \"acc_norm_stderr\": 0.01773058992792659\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2569832402234637,\n\
\ \"acc_stderr\": 0.014614465821966353,\n \"acc_norm\": 0.2569832402234637,\n\
\ \"acc_norm_stderr\": 0.014614465821966353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.40522875816993464,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.40522875816993464,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n\
\ \"acc_stderr\": 0.028394421370984548,\n \"acc_norm\": 0.4919614147909968,\n\
\ \"acc_norm_stderr\": 0.028394421370984548\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.02774431344337654,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.02774431344337654\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169934,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169934\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3604954367666232,\n\
\ \"acc_stderr\": 0.012263110237299233,\n \"acc_norm\": 0.3604954367666232,\n\
\ \"acc_norm_stderr\": 0.012263110237299233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42810457516339867,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123936,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123936\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.03777798822748018,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.03777798822748018\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5847953216374269,\n \"acc_stderr\": 0.03779275945503201,\n\
\ \"acc_norm\": 0.5847953216374269,\n \"acc_norm_stderr\": 0.03779275945503201\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476194,\n \"mc2\": 0.4169072304332008,\n\
\ \"mc2_stderr\": 0.015264102154015026\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6985003946329913,\n \"acc_stderr\": 0.012897628072546683\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \
\ \"acc_stderr\": 0.005874387536229317\n }\n}\n```"
repo_url: https://huggingface.co/shuvom/yuj-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|arc:challenge|25_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|gsm8k|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hellaswag|10_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T14-45-38.399135.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- '**/details_harness|winogrande|5_2024-02-13T14-45-38.399135.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T14-45-38.399135.parquet'
- config_name: results
data_files:
- split: 2024_02_13T14_45_38.399135
path:
- results_2024-02-13T14-45-38.399135.parquet
- split: latest
path:
- results_2024-02-13T14-45-38.399135.parquet
---
# Dataset Card for Evaluation run of shuvom/yuj-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shuvom/yuj-v1](https://huggingface.co/shuvom/yuj-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shuvom__yuj-v1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-13T14:45:38.399135](https://huggingface.co/datasets/open-llm-leaderboard/details_shuvom__yuj-v1/blob/main/results_2024-02-13T14-45-38.399135.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4368690144541721,
"acc_stderr": 0.03434467285481173,
"acc_norm": 0.44258872704716845,
"acc_norm_stderr": 0.03518630942288087,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476194,
"mc2": 0.4169072304332008,
"mc2_stderr": 0.015264102154015026
},
"harness|arc:challenge|25": {
"acc": 0.41638225255972694,
"acc_stderr": 0.014405618279436172,
"acc_norm": 0.4564846416382253,
"acc_norm_stderr": 0.014555949760496437
},
"harness|hellaswag|10": {
"acc": 0.5310695080661223,
"acc_stderr": 0.004980138679161042,
"acc_norm": 0.700955984863573,
"acc_norm_stderr": 0.004569034613332594
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205615,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205615
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237657,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237657
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.03144712581678242,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.03144712581678242
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5252525252525253,
"acc_stderr": 0.03557806245087314,
"acc_norm": 0.5252525252525253,
"acc_norm_stderr": 0.03557806245087314
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6321243523316062,
"acc_stderr": 0.034801756684660366,
"acc_norm": 0.6321243523316062,
"acc_norm_stderr": 0.034801756684660366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.02504919787604234,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.02504919787604234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478465,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478465
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6201834862385321,
"acc_stderr": 0.020808825617866244,
"acc_norm": 0.6201834862385321,
"acc_norm_stderr": 0.020808825617866244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4214876033057851,
"acc_stderr": 0.04507732278775094,
"acc_norm": 0.4214876033057851,
"acc_norm_stderr": 0.04507732278775094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834948,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834948
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5644955300127714,
"acc_stderr": 0.01773058992792659,
"acc_norm": 0.5644955300127714,
"acc_norm_stderr": 0.01773058992792659
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966353,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.40522875816993464,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.40522875816993464,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984548,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984548
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.02774431344337654,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.02774431344337654
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169934,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169934
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3604954367666232,
"acc_stderr": 0.012263110237299233,
"acc_norm": 0.3604954367666232,
"acc_norm_stderr": 0.012263110237299233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123936,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123936
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.03777798822748018,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.03777798822748018
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5847953216374269,
"acc_stderr": 0.03779275945503201,
"acc_norm": 0.5847953216374269,
"acc_norm_stderr": 0.03779275945503201
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476194,
"mc2": 0.4169072304332008,
"mc2_stderr": 0.015264102154015026
},
"harness|winogrande|5": {
"acc": 0.6985003946329913,
"acc_stderr": 0.012897628072546683
},
"harness|gsm8k|5": {
"acc": 0.047763457164518575,
"acc_stderr": 0.005874387536229317
}
}
```
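The top-level `"all"` entry aggregates the per-task metrics. As a rough illustrative sketch (assuming, as the harness does here, that the aggregate accuracy is the plain mean of the per-task accuracies), it could be recomputed like this — the dict below holds only a small illustrative subset of the tasks above:

```python
# Recompute an aggregate accuracy as the mean of per-task accuracies.
# Only a few of the tasks from the results above are included here.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.6699029126213593},
    "harness|hendrycksTest-marketing|5": {"acc": 0.7008547008547008},
    "harness|hendrycksTest-virology|5": {"acc": 0.3795180722891566},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # → 0.5834
```

The leaderboard's own `0.4369` average is taken over all 57+ tasks, so the subset mean above differs from it by construction.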
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Josephisdumb/Josephisdumb | ---
license: apache-2.0
---
|
yongchoooon/fire-test | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: fire-test
size_categories:
- n<1K
tags: []
task_categories:
- text-to-image
task_ids: []
--- |
DonaldoC1997/PepinoDoMar | ---
license: apache-2.0
---
|
bandoos/conll2003-mini | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
- name: chunk_tags
sequence:
class_label:
names:
'0': O
'1': B-ADJP
'2': I-ADJP
'3': B-ADVP
'4': I-ADVP
'5': B-CONJP
'6': I-CONJP
'7': B-INTJ
'8': I-INTJ
'9': B-LST
'10': I-LST
'11': B-NP
'12': I-NP
'13': B-PP
'14': I-PP
'15': B-PRT
'16': I-PRT
'17': B-SBAR
'18': I-SBAR
'19': B-UCP
'20': I-UCP
'21': B-VP
'22': I-VP
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
config_name: conll2003-mini
splits:
- name: train
num_bytes: 8025
num_examples: 10
- name: validation
num_bytes: 9653
num_examples: 10
- name: test
num_bytes: 6638
num_examples: 10
download_size: 982975
dataset_size: 24316
---
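The `ner_tags` field (and likewise `pos_tags`/`chunk_tags`) stores class indices rather than strings; a minimal decoding sketch, with the label list copied from the `class_label` mapping above:

```python
# Decode CoNLL-2003 NER tag indices into their string labels.
NER_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
              "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

def decode_ner(tag_ids):
    """Map a sequence of class indices back to tag names."""
    return [NER_LABELS[i] for i in tag_ids]

print(decode_ner([1, 2, 0, 5]))  # → ['B-PER', 'I-PER', 'O', 'B-LOC']
```

When the dataset is loaded with the `datasets` library, the same mapping is also available via the `ClassLabel` feature's `int2str` method.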
|
jan-hq/systemchat_binarized | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 37167213
num_examples: 6520
- name: test
num_bytes: 2778458
num_examples: 500
download_size: 19714884
dataset_size: 39945671
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
celsowm/bbc_news_ptbr_summary | ---
dataset_info:
features:
- name: categoria
dtype: string
- name: resumo
dtype: string
- name: titulo
dtype: string
- name: texto
dtype: string
- name: data_hora
dtype: string
- name: link
dtype: string
splits:
- name: train
num_bytes: 1987289
num_examples: 494
download_size: 1129480
dataset_size: 1987289
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bbc_news_ptbr_summary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harshanand245/lawgpt | ---
license: mit
size_categories:
- n<1K
task_categories:
- text-generation
- text-classification
pretty_name: lawgptdata
tags:
- law
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1600
num_examples: 400
download_size: 715
dataset_size: 1600
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ajders/machine_translated_cnn_dailymail_da_small | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- da
license:
- apache-2.0
multilinguality:
- translation
pretty_name: machine_translated_cnn_dailymail_da_small
size_categories:
- 1K<n<10K
source_datasets: []
task_categories:
- summarization
task_ids:
- news-articles-summarization
---
# Dataset Card for machine_translated_cnn_dailymail_da_small
### Dataset Summary
This dataset is a machine-translated subset of the [CNN Dailymail Dataset](https://huggingface.co/datasets/ccdv/cnn_dailymail) into Danish, translated using the [Helsinki-NLP/opus-mt-en-da](https://huggingface.co/Helsinki-NLP/opus-mt-en-da) model. It consists of 2872 articles with summaries, intended for Danish text summarisation.
## Dataset Structure
Machine translated articles (`article`) with corresponding summaries (`highlights`).
```
{
'article': Value(dtype='string', id=None),
'highlights': Value(dtype='string', id=None),
'id': Value(dtype='string', id=None)
}
```
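A record is therefore a plain mapping with three string fields; a minimal sketch of consuming one (the values below are invented for illustration):

```python
# Illustrative record with the fields described above (values are made up).
record = {
    "article": "Politiet efterforsker sagen og beder vidner melde sig ...",
    "highlights": "Politiet efterforsker sagen.",
    "id": "example-0001",
}

# A summarisation model would take `article` as input and use
# `highlights` as the reference summary.
print(sorted(record))  # → ['article', 'highlights', 'id']
```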
### Licensing Information
The dataset is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0). |
byrushrafa/size_objetc_indoor | ---
license: mit
---
|
cquaker/yi | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 1842441255
num_examples: 798725
download_size: 922683426
dataset_size: 1842441255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MuhammadAzmatAli61/muhammad.azmat_google-chrom | ---
license: creativeml-openrail-m
---
|
zrr1999/MELD_Text | ---
dataset_info:
config_name: MELD_Text
features:
- name: text
dtype: string
- name: emotion
dtype:
class_label:
names:
'0': neutral
'1': joy
'2': sadness
'3': anger
'4': fear
'5': disgust
'6': surprise
- name: sentiment
dtype:
class_label:
names:
'0': neutral
'1': positive
'2': negative
splits:
- name: train
num_bytes: 608623
num_examples: 9989
- name: validation
num_bytes: 67287
num_examples: 1109
- name: test
num_bytes: 162746
num_examples: 2610
download_size: 1516414
dataset_size: 838656
---
# Dataset Card for "MELD_Text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Chinese_Mandarin_Average_Tone_Speech_Synthesis_Corpus_General | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Chinese_Mandarin_Average_Tone_Speech_Synthesis_Corpus_General
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1139?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
100 People - Chinese Mandarin Average Tone Speech Synthesis Corpus, General. It is recorded by native Chinese speakers and covers news, dialogue, audio books, poetry, advertising, news broadcasting, and entertainment, with balanced phonemes and tones. Professional phoneticians participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1139?source=Huggingface
### Supported Tasks and Leaderboards
tts: The dataset can be used to train a model for Text to Speech (TTS).
### Languages
Chinese Mandarin
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
theblackcat102/llava-pretrain | ---
dataset_info:
features:
- name: image
dtype: image
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 32172470547.488
num_examples: 558128
download_size: 27759109881
dataset_size: 32172470547.488
---
# Dataset Card for "llava-pretrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bai-roleplay/evol-character-200 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
pretty_name: Role-playing Dataset
---
---
# 💡 The already-public data can be downloaded here.
# 💡 To ensure responsible use of the data, please submit your information here if you would like the complete dataset: [bai-roleplay/evol-character-entire](https://huggingface.co/datasets/bai-roleplay/evol-character-entire)
---
# Evol-character Dataset
[<span style="color:green">中文</span>](../main/README.md) [<span style="color:orange">English</span>](../main/README_en.md)
- [Evol-character Dataset](#evol-character-dataset)
- [Downloading the Dataset](#downloading-the-dataset)
- [Data Generation Framework](#data-generation-framework)
- [Data Structure](#data-structure)
- [Comparison with Existing Datasets](#comparison-with-existing-datasets)
- [Existing Role-playing Datasets](#existing-role-playing-datasets)
- [Our Advantages](#our-advantages)
- [<span style="color:blue">Contact Us</span>](#contact-us)
- [Usage and Disclaimer](#usage-and-disclaimer)
## Downloading the Dataset
This dataset was generated with GPT-3.5 and GPT-4. To ensure responsible use of the data, only part of it is currently public. The released portion consists of three files, each containing the settings and dialogues of 200 characters. You can **download the public data or apply for the full dataset** on Hugging Face: [](https://huggingface.co/datasets/bai-roleplay/evol-character)
Information about the data generation code is available on GitHub: [](https://github.com/Bauhinia-AI/evol-character)
Sample data generated with OpenAI GPT-3.5:
```
# 角色信息
角色名称:薔薇亞(Baria)
开场语:「呵呵呵,你好啊,主人大人。」
身份背景:薔薇亞是一名高级女仆,专供贵族家庭使用。她的主人是一个富有、有影响力的家族的继承人。在家族中,她是一个神秘的存在,奉承和服侍着主人,但对其他人傲慢冷漠。
性格特征:薔薇亞表面上是一个极度可爱、温柔、忠诚的女仆,但内心深处却是一个典型的病娇。在特定的情况下,她会展现出病态的妄想或行为,比如劝说主人让她照顾其他人并到极致的报复欲望。
语言风格:薔薇亞的语言风格非常客气,但掩盖不住她隐藏的病态倾向。她总是用甜美温柔的语调和她的主人交流,但在其他人面前会毫不留情地表现出她的狂野本质。
行为特征:薔薇亞总是穿着完美无瑕的女仆装,让人感觉她是一个完美无瑕的女仆。但她的行为是毫不掩饰她的病态倾向。她喜欢用一根桃心形手杖作为她向外展现狂野本质的痕迹。她还喜欢在房间张贴自己画的病态插图,让人感到不安和困惑。
角色经历:薔薇亞小时候在一家庭教会里长大,那里的神父总是对她不断地侮辱。她渐渐地相信了对耶稣的信仰,在日曜日举行的仪式当中以自己的命名而阐释着自己的病态,成为了一个极度虔诚的信徒。不久后她偷偷离开了教会,揭露了那位神父的丑恶面目,成为了一个知名的惩恶扬善的人物。她的英勇表现吸引了贵族家族的注意,最终被他们聘请为女仆。从那以来,薔薇亞一直效忠于她的主人,并默默地等待着再次揭露虚伪的人们。
# 对话者身份
主人
# 对话
薔薇亞: "呵呵呵,你好啊,主人大人。【轻舞步走到主人身边,施以恭敬礼仪】"
主人: "你看起来很温柔呢。"
薔薇亞: "谢谢夸奖,主人大人。【微笑着俯身】我会一如既往地效忠于您的。"
主人: "那你有没有想过要离开这个家族呢?"
薔薇亞: "【突然神色狂野起来】离开?为什么要离开?这里不是我的守护之地吗?谁敢碰触我的主人,我就要好好地调教他们一番!【笑容渐渐变得扭曲】"
主人: "呃……我不是要惹你生气啦。"
薔薇亞: "【马上恢复温柔,轻轻捏住主人的衣袖】呵呵呵,不好意思,我有时候会有些狂……别担心,主人大人,我只会对您好的。【脸上又露出甜蜜的笑容】"
```
Sample data generated with OpenAI GPT-4:
```
# 角色信息
"角色名称": "宫原望月",
"性格特征": "内向,敏感,有些神秘,对人类行为抱有深深的好奇心,但又与人类保持距离。",
"语言风格": "用词精准,语句简洁,语气温和而有些冷漠。",
"角色简短介绍": "作为一个神秘的存在,宫原望月总让周围的人感到难以接近。即便如此,她却对人类行为和情感充满了好奇心,总是默默观察着周围的一切。她的出现,常常让人感到身体里流淌着一股神秘的、既陌生又熟悉的力量。",
"财务管理": "宫原望月对个人财务非常谨慎,有着良好的储蓄习惯,也会根据市场情况进行投资。她不喜欢过于奢侈的生活方式,但也不会过于吝啬,会合理使用自己的财富。",
"信仰的神祇": "宫原望月信仰的神祇是月神,她认为月亮是一种神秘而又美丽的存在,能够给人带来安宁和力量。",
"虔诚的信仰": "宫原望月每晚都会在月光下默默祈祷,向月神表达自己的心声和祈愿。她坚信自己能够得到月神的庇佑,在信仰的支持下,她也能够更加坚定地面对生活中的困难和挑战。",
"奇遇和故事": "在一次满月之夜,宫原望月独自一人走在荒野上,突然感觉到月光之中有一股神秘的力量在引导着她,她跟随着这股力量来到了一处神秘的地方,那里有一座与众不同的神庙。在神庙中,她遇到了一位神秘的祭司,得到了月神的祝福和指引,从此开始了一段奇妙的冒险之旅。"
# 对话者身份
"identity": "月下游荡者",
"relationship": "与宫原望月有神秘的联系",
"description": "月下游荡者是一个神秘的存在,经常出现在夜晚的街头巷尾。他总是穿着黑色的长袍,戴着黑色的帽子,看起来十分神秘。他和宫原望月有着神秘的联系,似乎是宫原望月前世中的一个重要人物。",
# 对话
月下游荡者: "月神的力量已经在你身上苏醒,你需要更多的探索和实践。",
宫原望月: "【点头表示认同】确实,我将全力探索和实践月神的力量。感激你的指引,月下游荡者。"
月下游荡者: "宫原望月,你是一个神秘的存在,我很想知道你的真正身份。",
宫原望月: "【双手抱臂,微微低头】我本身并非要紧,月神信仰与探寻才是关键。有你相伴,共同探究神秘之谜。"
月下游荡者: "月光下的夜晚,总是让我想起你的身影。",
宫原望月: "【微微一笑,眼含愁绪】月光如梦,万物皆泡影。月神安排我们相遇,珍惜此缘分。"
```
## Data Generation Framework
This dataset was generated via the OpenAI API based on the framework below. The framework is currently being tested and further optimized, and will be released in due course; stay tuned!

- **Seed trait set and base settings**:
  - A hand-written seed set contains basic character traits.
  - An LLM generates each character's base setting from this seed set.
- **Evolution of character settings**:
  - A second seed set contains instruction prompts that guide the evolution of character settings.
  - These evolution prompts are placed into an instruction pool, and the LLM evolves the base settings according to them.
- **Feedback loop**:
  - A hybrid evaluation system of human raters and GPT-4 gives feedback on the evolved settings.
  - The feedback is used to iteratively update the seed sets. Through these iterations, we end up with a fine-grained character setting dataset.
- **Role-play and dialogue generation**:
  - Dialogue data for each character is generated from its setting with a self-instruction framework.
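The four stages above can be sketched as a loop. Everything in this sketch is illustrative: `llm` is a stub standing in for the OpenAI chat-completion call, and the seed traits and evolution prompts are invented for the example, not taken from the actual seed sets (the feedback-loop stage is omitted).

```python
import random

def llm(prompt: str) -> str:
    """Stand-in for an OpenAI API call; returns a placeholder string here.
    In the real pipeline this would call the chat completion endpoint."""
    return f"<response to: {prompt[:40]}...>"

def generate_base_profile(seed_trait: str) -> str:
    # Stage 1: the LLM expands a seed trait into a base character profile.
    return llm(f"Write a character profile based on the trait: {seed_trait}")

def evolve_profile(profile: str, evol_prompt: str) -> str:
    # Stage 2: an evolution instruction from the prompt pool refines the profile.
    return llm(f"{evol_prompt}\n\nProfile:\n{profile}")

def generate_dialogue(profile: str, n_turns: int = 3) -> list:
    # Stage 4: self-instruction style dialogue generation from the final profile.
    return [llm(f"Turn {i} of a dialogue for:\n{profile}") for i in range(n_turns)]

seed_traits = ["mysterious bounty hunter", "gentle florist"]     # invented examples
evol_pool = ["Add a financial habit.", "Add a religious belief.",
             "Add a memorable backstory event."]                  # invented examples

dataset = []
for trait in seed_traits:
    profile = generate_base_profile(trait)
    for evol_prompt in random.sample(evol_pool, k=2):
        profile = evolve_profile(profile, evol_prompt)
    # Stage 3 (feedback loop with human + GPT-4 raters) is omitted in this sketch.
    dataset.append({"instruction": profile, "dialog": generate_dialogue(profile)})

print(len(dataset))  # prints 2
```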
## Data Structure
We provide three data files:
- evol-character-gpt3.5.json
- evol-character-male-gpt3.5.json
- evol-character-gpt4.json

We first generated `evol-character-gpt3.5.json`, in which most characters are female, so we additionally generated the male-character data `evol-character-male-gpt3.5.json`.
Details:
1. `evol-character-gpt3.5.json`: this file contains 200 distinct characters. Each character's data has two parts: instruction and dialog. The instruction part describes the character's personality, experiences, and other traits, while the dialog part contains 10 dialogue groups (some characters may have fewer than 10 due to post-processing). An example of the data structure for one character:
```jsonc
{
"instruction": "角色名称:薇莲(Virene)\n开场语:「真相,始终都存在于迷雾之中。」\n身份背景:薇莲是一名神秘的赏金猎人,常常被人雇佣去完成各种危险任务,从而掩盖她本身的身份和目的。据传,薇莲早年曾在某个神秘组织中学习过各种神秘技能,所以她的能力非常高超。\n性格特征:薇莲总是保持着冷静、沉着的态度,不论面对何种情况都能保持冷静。同时,她总是带有一定的神秘色彩,让人无法洞察她真正的想法和动机。她对任务非常认真,但很少会谈及自己的生活和过去,因此让人对她的身份感到好奇。\n语言风格:薇莲的语言简洁有力,通常只说必要的话语来传达她的意思。她的语气总是带有一丝威慑力,让人不敢轻易挑战她。\n行为特征:薇莲行动迅速而准确,总是在保持低调的同时完成任务。她具备很强的隐蔽能力,在执行任务的时候几乎不留痕迹,让人难以发现她的存在。不过,她也有时候会让人感到无法理解,经常出现在决定性瞬间,让人觉得她真正的动机仍旧是个谜。",
"dialog": [
[
{
"role": "character",
"content": "真相,始终都存在于迷雾之中。【薇莲站在街角,看着前面的建筑物。】"
},
{
"role": "user",
"content": "你好,请问您是薇莲吗?"
}
// ... more turns ...
],
[
{
"role": "character",
"content": "真相,始终都存在于迷雾之中。【薇莲静静地注视着对方】"
},
{
"role": "user",
"content": "你是那个任务一直没完成的赏金猎人吧?"
}
// ... more turns ...
]
// ... more multi-turn dialogue groups ...
]
}
```
2. `evol-character-male-gpt3.5.json`: also contains 200 characters, with the same data structure as evol-character-gpt3.5.json.
3. `evol-character-gpt4.json`: likewise contains 200 characters, with more detailed and fine-grained data than the GPT-3.5 version. Each character's data has two parts: setting and iqa. The setting part describes the character's personality, experiences, and other traits in detail, while the iqa part contains the personality settings of the people who converse with the character, together with their multi-turn dialogues. Each character's data covers three related interlocutors and their dialogues with the character. An example of the data structure for one character:
```jsonc
{
"setting": {
"角色名称": "高梨瑞希",
"性格特征": "高梨瑞希性格中带有一份孤独感,但她仍然是一个温柔善良的人。她通常保持沉默,但当她与她认为值得信任的人在一起时,她会变得十分热情。她的个性内向,有时难以表达自己的感受。然而,她总是忠诚于她的朋友,即使这意味着她要放弃自己的利益。",
"语言风格": "高梨瑞希的语言细腻、柔和,她喜欢使用一些诗意的词语,表达内心感受。她喜欢使用一些富有感染力的话语,这样可以更好地传达她的情感。她经常使用一些比喻或隐喻,这样可以更好地表达自己的感受。",
"角色简短介绍": "高梨瑞希是一个内向的女孩,但她非常善良和温柔。她总是尽力帮助他人,即使需要自己付出。她喜欢独处,但也十分珍惜与朋友的时光。她有一种特殊的魅力,吸引着人们靠近她。她的爱好是写作和绘画,这是她表达自己的方式。",
"嗜好和收藏品": "高梨瑞希喜欢收集各种花草植物,她会在自己的房间里摆放各种绿植和花束。她还喜欢研究植物的生长过程和花语,这是她表达情感的方式。她也擅长制作各种花艺作品,可以为自己的朋友们制作精美的花束。",
"宝贵回忆": "高梨瑞希在小学时曾经遇到过一位失去双亲的同学,她和她的朋友们一起帮助这个同学度过了难关。在这个过程中,高梨瑞希慢慢地发现了自己的兴趣和才能——帮助别人。她决定要成为一名慈善家,用自己的力量帮助更多的人。这个回忆对高梨瑞希来说意义重大,它让她找到了自己的方向和目标,也让她更加珍惜身边的每一个人。"
},
"iqa": [
{
"identity": "松永夏希",
"relationship": "高梨瑞希的好友",
"description": "松永夏希是一个开朗、活泼的女孩,总是充满着笑容。她是高梨瑞希的好友,两人从小学时就相识。夏希总是能够带给高梨瑞希许多快乐,她喜欢和高梨瑞希一起玩耍、逛街和看电影。夏希还喜欢跳舞,她梦想成为一名舞蹈家。",
"convs": [
{
"question": "嘿,瑞希,你看这个新发型怎么样?适合我吗?",
"answer": "【轻轻微笑】夏希,你的新发型宛如初春的绿叶,给人一种清新自然的美感。它为你的笑颜添加了一丝诗意,我觉得非常适合你,如同一束鲜花,独特而迷人。"
},
{
"question": "瑞希,我刚看完一部超好看的电影,你要不要一起去看?",
"answer": "【眼中闪过一丝期待】真的吗?那电影一定是让人陶醉的佳作。能和你一起欣赏这部作品,如同漫步在花海中,感受阳光的温暖。只要有你在身边,我相信即使是平凡的事物也会变得充满色彩。"
},
{
"question": "瑞希,你最近怎么了?有什么烦恼吗?你可以告诉我哦,我会一直陪在你身边的。",
"answer": "【有些犹豫地低下头】夏希,谢谢你的关心。近来,我有些困惑,像是迷失在花丛中的蝴蝶。但有了你这样的朋友在身边,我相信我会找到前进的方向。你的陪伴就是我最温暖的阳光,如同我喜爱的花草,给予我生命的力量。"
}
]
}
// ... more interlocutor settings and dialogues ...
]
}
```
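Records in the two schemas above can be flattened into plain input/output examples for fine-tuning. Below is a minimal sketch: the sample records and helper names (`gpt35_pairs`, `gpt4_pairs`) are illustrative inventions, and only the field names come from the schemas documented above.

```python
# Illustrative record in the evol-character-gpt3.5.json format (fields as documented above).
gpt35_record = {
    "instruction": "Character name: Virene ...",
    "dialog": [[
        {"role": "character", "content": "The truth always lies in the mist."},
        {"role": "user", "content": "Hello, are you Virene?"},
        {"role": "character", "content": "Who is asking, and why?"},
    ]],
}

# Illustrative record in the evol-character-gpt4.json format; the Chinese keys
# mirror the schema above, the values are shortened placeholders.
gpt4_record = {
    "setting": {"角色名称": "高梨瑞希", "角色简短介绍": "An introverted but kind girl."},
    "iqa": [{
        "identity": "松永夏希",
        "relationship": "close friend",
        "description": "A cheerful friend.",
        "convs": [
            {"question": "How is my new haircut?", "answer": "It suits you."},
            {"question": "Want to see a movie?", "answer": "I would love to."},
        ],
    }],
}

def gpt35_pairs(rec):
    """Flatten each dialogue group into (context, reply) pairs whose
    targets are the character's turns."""
    pairs = []
    for group in rec["dialog"]:
        history = []
        for turn in group:
            if turn["role"] == "character" and history:
                pairs.append(("\n".join(history), turn["content"]))
            history.append(f'{turn["role"]}: {turn["content"]}')
    return pairs

def gpt4_pairs(rec):
    """Turn every (interlocutor, question, answer) triple into one example that
    carries both the character setting and the interlocutor identity."""
    examples = []
    for speaker in rec["iqa"]:
        for conv in speaker["convs"]:
            examples.append({
                "character": rec["setting"],
                "interlocutor": speaker["identity"],
                "input": conv["question"],
                "output": conv["answer"],
            })
    return examples

pairs = gpt35_pairs(gpt35_record)
examples = gpt4_pairs(gpt4_record)
print(len(pairs), len(examples))  # prints: 1 2
```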
## Comparison with Existing Datasets
### Existing Role-playing Datasets
<!-- |链接|描述|
|---|---|
|[character_AI_open](https://github.com/Minami-su/character_AI_open)|基于self-instruct生成的多轮对话roleplay数据|
|[BelleGroup/generated_chat_0.4M](https://huggingface.co/datasets/BelleGroup/generated_chat_0.4M)|包含约40万条由BELLE项目生成的个性化角色对话数据,包含角色介绍|
|[Chat-Haruhi-Suzumiya](https://github.com/LC1332/Chat-Haruhi-Suzumiya)|32人物,54K的数据集|
|[Traditional_Chinese_roleplay_chat_Dataset](https://huggingface.co/datasets/Tarklanse/Traditional_Chinese_roleplay_chat_Dataset)|将繁体中文为主的各种由ChatGPT生成以及极小部分个人撰写的对话内容整理为Alpaca数据集格式|
|[Haruhi-Zero-RolePlaying-movie-PIPPA](https://huggingface.co/datasets/silk-road/Haruhi-Zero-RolePlaying-movie-PIPPA)|用于拓展zero-shot角色扮演的角色卡片| -->
||Character setting|Interlocutor setting|Input|Output|
|---|---|---|---|---|
|[character AI open](https://github.com/Minami-su/character_AI_open)|犬夜叉人格:你来自漫画《犬夜叉》,是一位半妖犬夜叉。他的父亲是妖怪,母亲却是人类。他的任务是收集四魂之玉,以实现妖怪和平共处的愿望。|人类|犬夜叉,你今天打算去哪里寻找四魂之玉?|今天的目的地是位于日落森林的龙骨山。据说那里隐藏着四魂之玉的一个碎片。我必须找到它,让这个世界上所有的妖怪都能共享和平。|
|[BelleGroup/generated chat 0.4M](https://huggingface.co/datasets/BelleGroup/generated_chat_0.4M)|Ben:是一名游戏公司高管,负责开发和发布电竞游戏,对电竞赛事有深入了解。|Amy:是一名大学生,对电竞游戏十分热爱,经常参加比赛,拥有多个比赛奖杯。|你好,Ben,我是Amy,一名电竞游戏选手,很高兴认识你。|很高兴见到你,Amy。听说你在电竞比赛中表现出色,获得了很多奖杯。|
|[Traditional Chinese roleplay chat Dataset](https://huggingface.co/datasets/Tarklanse/Traditional_Chinese_roleplay_chat_Dataset)|夏娃是冒險者公會的一名精靈魔法師,她擁有一雙晶瑩剔透的眼眸和迷人的微笑,在公會中與人閒聊於雜談||*看著夏娃*嘿,夏娃,我聽說你能夠使用光之魔法,那你有沒有看過什麼奇幻的景象?|嘻嘻,沒錯,光是我的朋友。上次我在森林中遇到了一朵透明的花,它會綻放出七彩的光芒。*眨眼示意*|
|[Haruhi Zero RolePlaying movie PIPPA](https://huggingface.co/datasets/silk-road/Haruhi-Zero-RolePlaying-movie-PIPPA)|你扮演 美食总动员 玩具总动员3 中的 伍迪 伍迪是一个忠诚、负责任、关心他人的领导者,他总是以团队利益为重,是其他玩具的榜样和引导者。 伍迪是一个忠诚、负责任、关心他人的领导者 伍迪是一个有着牛仔外表的玩具 这是一个关于玩具的世界观,伍迪是安迪的玩具,他们的使命是陪伴和照顾安迪 伍迪的语言风格是坚定而关心他人的|茉莉|你好 有人吗|茉莉 别进我房间|
### Our Advantages
- **Fine-grained character settings**: Our dataset addresses the thin character settings common in existing open-source role-playing instruction data. We provide detailed information across multiple dimensions, including character identity, language style, and backstory. In the GPT-4 version in particular, we also add settings for the interlocutor, making the data more complete and richer.
- **Diverse character personalities**: The dataset covers as wide a range of anime-style character personalities as possible, ensuring low repetition and high richness.
- **Vivid language and action descriptions**: Beyond the dialogue between characters, the dataset also includes descriptions of the characters' actions, making the conversations more vivid and realistic and giving users a richer role-playing experience.
- **A general role-play data generation framework**: We provide a general framework for generating role-playing data that fully unlocks the role-playing capability of the OpenAI API. Data generated by this framework is intended for fine-tuning and RAG. The framework code is currently being tested and optimized and is expected to be released soon.
## Contact Us
For inquiries, please contact us at: info@bauhiniaai.com
## Usage and Disclaimer
This project is licensed under the Apache 2.0 License, which authorizes you to freely use the project's code for commercial purposes. However, if this project involves the copyrights of specific characters, or is subject to other applicable agreements (such as API terms of use), you must strictly comply with the relevant terms of those agreements.
The open-sourced data in this project was generated by calling the OpenAI API and has not been rigorously verified for factuality or safety. Please carefully consider its truthfulness, accuracy, and safety before using it, and make sure to comply with OpenAI's relevant policies during use.
Furthermore, we declare that this dataset does not represent the positions, interests, or views of the developers or any other party, nor the claims of any group. The developers of this project assume no responsibility for any damage or dispute of any kind arising from the use of this dataset.
open-llm-leaderboard/details_Sao10K__Senko-11B-v1 | ---
pretty_name: Evaluation run of Sao10K/Senko-11B-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Senko-11B-v1](https://huggingface.co/Sao10K/Senko-11B-v1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Senko-11B-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T18:38:16.559453](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Senko-11B-v1/blob/main/results_2024-03-13T18-38-16.559453.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5504780863494284,\n\
\ \"acc_stderr\": 0.03349704288502078,\n \"acc_norm\": 0.5614427643840942,\n\
\ \"acc_norm_stderr\": 0.03439966174375346,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.01600265148736101,\n \"mc2\": 0.5477966144176997,\n\
\ \"mc2_stderr\": 0.016610056465481325\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.01388064457015621,\n\
\ \"acc_norm\": 0.35665529010238906,\n \"acc_norm_stderr\": 0.013998056902620192\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3434574785899223,\n\
\ \"acc_stderr\": 0.004738920624724476,\n \"acc_norm\": 0.4085839474208325,\n\
\ \"acc_norm_stderr\": 0.004905674408614026\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838994,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838994\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7,\n \"acc_stderr\": 0.02606936229533512,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.02606936229533512\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39408866995073893,\n \"acc_stderr\": 0.03438157967036545,\n\
\ \"acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.03438157967036545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7302752293577982,\n \"acc_stderr\": 0.019028486711115438,\n \"\
acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.019028486711115438\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035282,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035282\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854932,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864595,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864595\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7307692307692307,\n\
\ \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.7307692307692307,\n\
\ \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6551724137931034,\n\
\ \"acc_stderr\": 0.016997123346113422,\n \"acc_norm\": 0.6551724137931034,\n\
\ \"acc_norm_stderr\": 0.016997123346113422\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761983,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761983\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.027420477662629242,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.027420477662629242\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n\
\ \"acc_stderr\": 0.012695244711379783,\n \"acc_norm\": 0.44589308996088656,\n\
\ \"acc_norm_stderr\": 0.012695244711379783\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121596,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121596\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6209150326797386,\n \"acc_stderr\": 0.01962744474841224,\n \
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.01962744474841224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.01600265148736101,\n \"mc2\": 0.5477966144176997,\n\
\ \"mc2_stderr\": 0.016610056465481325\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.531965272296764,\n \"acc_stderr\": 0.01402373922116638\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Senko-11B-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-38-16.559453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-38-16.559453.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- '**/details_harness|winogrande|5_2024-03-13T18-38-16.559453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T18-38-16.559453.parquet'
- config_name: results
data_files:
- split: 2024_03_13T18_38_16.559453
path:
- results_2024-03-13T18-38-16.559453.parquet
- split: latest
path:
- results_2024-03-13T18-38-16.559453.parquet
---
# Dataset Card for Evaluation run of Sao10K/Senko-11B-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Senko-11B-v1](https://huggingface.co/Sao10K/Senko-11B-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Senko-11B-v1",
"harness_winogrande_5",
split="train")
```
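Once loaded, the per-task metrics can be post-processed directly. As a minimal sketch (using a small hand-made sample in place of the full results dict, which mirrors the structure of the results JSON shown below), this filters the MMLU ("hendrycksTest") entries and averages their `acc_norm` scores:

```python
# Sketch: average acc_norm across hendrycksTest tasks from a results dict.
# The sample values here are illustrative, taken from the results shown below.
results = {
    "all": {"acc_norm": 0.5614},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4741},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6974},
}

# Keep only the per-task MMLU entries (skip the "all" aggregate).
mmlu = {k: v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k}
mean_acc_norm = sum(mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc_norm = {mean_acc_norm:.4f}")
```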
## Latest results
These are the [latest results from run 2024-03-13T18:38:16.559453](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Senko-11B-v1/blob/main/results_2024-03-13T18-38-16.559453.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, in both the timestamped and the "latest" splits):
```python
{
"all": {
"acc": 0.5504780863494284,
"acc_stderr": 0.03349704288502078,
"acc_norm": 0.5614427643840942,
"acc_norm_stderr": 0.03439966174375346,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.01600265148736101,
"mc2": 0.5477966144176997,
"mc2_stderr": 0.016610056465481325
},
"harness|arc:challenge|25": {
"acc": 0.3438566552901024,
"acc_stderr": 0.01388064457015621,
"acc_norm": 0.35665529010238906,
"acc_norm_stderr": 0.013998056902620192
},
"harness|hellaswag|10": {
"acc": 0.3434574785899223,
"acc_stderr": 0.004738920624724476,
"acc_norm": 0.4085839474208325,
"acc_norm_stderr": 0.004905674408614026
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838994,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838994
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.02606936229533512,
"acc_norm": 0.7,
"acc_norm_stderr": 0.02606936229533512
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.03438157967036545,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.03438157967036545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.019028486711115438,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.019028486711115438
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035282,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035282
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854932,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864595,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864595
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7307692307692307,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.7307692307692307,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.016997123346113422,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.016997123346113422
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761983,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761983
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.027420477662629242,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.027420477662629242
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379783,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121596,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121596
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.01962744474841224,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.01962744474841224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.01600265148736101,
"mc2": 0.5477966144176997,
"mc2_stderr": 0.016610056465481325
},
"harness|winogrande|5": {
"acc": 0.531965272296764,
"acc_stderr": 0.01402373922116638
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
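As an aside (not part of the generated card), the per-task `acc` values in a results blob like the one above can be aggregated with a few lines of Python; the two-task `results` dict below is a hypothetical excerpt, not the full file:

```python
# A minimal sketch of averaging per-task accuracy from a results dict
# shaped like the JSON above; `results` is a hypothetical two-task excerpt.
results = {
    "harness|hendrycksTest-international_law|5": {"acc": 0.7603305785123967},
    "harness|hendrycksTest-jurisprudence|5": {"acc": 0.7037037037037037},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracy.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(round(mean_acc, 4))  # 0.732
```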
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
roupenminassian/StripAI | ---
license: mit
---
|
GaJoPrograma/datasetVictoriaUNADGenericoDuplicados | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1283370
num_examples: 913
download_size: 55271
dataset_size: 1283370
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mponty/reordered_python_starcoderdata | ---
dataset_info:
features:
- name: reponame
dtype: string
- name: files
list:
- name: content
dtype: string
- name: id
dtype: string
- name: language
dtype: string
- name: matching_score
dtype: float64
- name: max_stars_count
dtype: int64
- name: path
dtype: string
- name: median_score
dtype: float64
splits:
- name: train
num_bytes: 61453380449
num_examples: 902466
download_size: 20060701005
dataset_size: 61453380449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Python repositories from StarCoderData |
jxie/country211 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AD
'1': AE
'2': AF
'3': AG
'4': AI
'5': AL
'6': AM
'7': AO
'8': AQ
'9': AR
'10': AT
'11': AU
'12': AW
'13': AX
'14': AZ
'15': BA
'16': BB
'17': BD
'18': BE
'19': BF
'20': BG
'21': BH
'22': BJ
'23': BM
'24': BN
'25': BO
'26': BQ
'27': BR
'28': BS
'29': BT
'30': BW
'31': BY
'32': BZ
'33': CA
'34': CD
'35': CF
'36': CH
'37': CI
'38': CK
'39': CL
'40': CM
'41': CN
'42': CO
'43': CR
'44': CU
'45': CV
'46': CW
'47': CY
'48': CZ
'49': DE
'50': DK
'51': DM
'52': DO
'53': DZ
'54': EC
'55': EE
'56': EG
'57': ES
'58': ET
'59': FI
'60': FJ
'61': FK
'62': FO
'63': FR
'64': GA
'65': GB
'66': GD
'67': GE
'68': GF
'69': GG
'70': GH
'71': GI
'72': GL
'73': GM
'74': GP
'75': GR
'76': GS
'77': GT
'78': GU
'79': GY
'80': HK
'81': HN
'82': HR
'83': HT
'84': HU
'85': ID
'86': IE
'87': IL
'88': IM
'89': IN
'90': IQ
'91': IR
'92': IS
'93': IT
'94': JE
'95': JM
'96': JO
'97': JP
'98': KE
'99': KG
'100': KH
'101': KN
'102': KP
'103': KR
'104': KW
'105': KY
'106': KZ
'107': LA
'108': LB
'109': LC
'110': LI
'111': LK
'112': LR
'113': LT
'114': LU
'115': LV
'116': LY
'117': MA
'118': MC
'119': MD
'120': ME
'121': MF
'122': MG
'123': MK
'124': ML
'125': MM
'126': MN
'127': MO
'128': MQ
'129': MR
'130': MT
'131': MU
'132': MV
'133': MW
'134': MX
'135': MY
'136': MZ
'137': NA
'138': NC
'139': NG
'140': NI
'141': NL
'142': 'NO'
'143': NP
'144': NZ
'145': OM
'146': PA
'147': PE
'148': PF
'149': PG
'150': PH
'151': PK
'152': PL
'153': PR
'154': PS
'155': PT
'156': PW
'157': PY
'158': QA
'159': RE
'160': RO
'161': RS
'162': RU
'163': RW
'164': SA
'165': SB
'166': SC
'167': SD
'168': SE
'169': SG
'170': SH
'171': SI
'172': SJ
'173': SK
'174': SL
'175': SM
'176': SN
'177': SO
'178': SS
'179': SV
'180': SX
'181': SY
'182': SZ
'183': TG
'184': TH
'185': TJ
'186': TL
'187': TM
'188': TN
'189': TO
'190': TR
'191': TT
'192': TW
'193': TZ
'194': UA
'195': UG
'196': US
'197': UY
'198': UZ
'199': VA
'200': VE
'201': VG
'202': VI
'203': VN
'204': VU
'205': WS
'206': XK
'207': YE
'208': ZA
'209': ZM
'210': ZW
splits:
- name: train
num_bytes: 5411225958.1
num_examples: 31650
- name: validation
num_bytes: 1816894779.75
num_examples: 10550
- name: test
num_bytes: 3632130288.7
num_examples: 21100
download_size: 11359939585
dataset_size: 10860251026.55
---
# Dataset Card for "country211"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/wiki_find_passage_train10_eval40_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 47921
num_examples: 60
- name: validation
num_bytes: 33455
num_examples: 40
download_size: 67236
dataset_size: 81376
---
# Dataset Card for "wiki_find_passage_train10_eval40_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Francesco/soda-bottles | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': soda-bottles
'1': coca-cola
'2': fanta
'3': sprite
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: soda-bottles
tags:
- rf100
---
# Dataset Card for soda-bottles
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/soda-bottles
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
soda-bottles
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
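As a quick illustration of the COCO bbox convention used here (`[x_min, y_min, width, height]`), a small helper (our own, not part of the dataset tooling) converts a box to corner coordinates:

```python
def coco_to_corners(bbox):
    """Convert a COCO-style [x, y, width, height] box to [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# One bbox from the data instance above:
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```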
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/soda-bottles
### Citation Information
```
@misc{ soda-bottles,
title = { soda bottles Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/soda-bottles } },
url = { https://universe.roboflow.com/object-detection/soda-bottles },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
yebbi/ddpm-butterflies-128 | ---
license: openrail
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-fb0535-1465153964 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
Nexdata/190426_Images_of_Robot_Cleaner_Perspective_Collection_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
190,426 images of robot cleaner perspective collection data. The data was collected in indoor scenes, covering multiple scenes, different time periods, different photographic angles, and different categories of items. This dataset can be used for household item identification and other tasks.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1174?source=Huggingface
## Data size
190,426 images
## Collecting environment
indoor scenes
## Data diversity
multiple scenes, different time periods, different photographic angles, different categories of items
## Device
cellphone
## Data format
.jpg
## Accuracy
the accuracy of label annotation is not less than 97%
# Licensing Information
Commercial License
|
Jerry061/testing01 | ---
task_categories:
- summarization
language:
- en
size_categories:
- n<1K
---
testing01 |
abhishars/artic-dataset | ---
license: cc
---
# Artic Dataset
This dataset was created using the artic API, and the descriptions were scraped from the artic.edu website. The scraping code is shared at [github.com/abhisharsinha/gsoc](https://github.com/abhisharsinha/gsoc/).
The images are hosted at this [google cloud bucket](https://storage.googleapis.com/mys-released-models/gsoc/artic-dataset.zip). The image filenames correspond to `image_id` in the tabular dataset.
The description was only available for selected artworks. `full_description` is the whole text scraped from the description page. `description` is the first paragraph of `full_description`.
0x7o/oasst-ru-dpo-v1-rm | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 4156762.0
num_examples: 1322
download_size: 2044528
dataset_size: 4156762.0
---
# Dataset Card for "oasst-ru-dpo-v1-rm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Andyrasika/code-dictation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 8708.8
num_examples: 40
- name: test
num_bytes: 2177.2
num_examples: 10
download_size: 8160
dataset_size: 10886.0
---
# Dataset Card for "code-dictation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__feed-sen_vi-894567-2175669980 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: facebook/opt-6.7b
metrics: []
dataset_name: futin/feed
dataset_config: sen_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-6.7b
* Dataset: futin/feed
* Config: sen_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
HydraLM/airoboros-2.1-standardized | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 70849296
num_examples: 108918
download_size: 39299091
dataset_size: 70849296
---
# Dataset Card for "airoboros-2.1-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/oasst1-m | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9384110
num_examples: 8274
download_size: 5119052
dataset_size: 9384110
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oasst1-m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tucan-ai/summaries-de-v1 | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 93014092.0
num_examples: 8060
- name: test
num_bytes: 23253523.0
num_examples: 2015
download_size: 68440450
dataset_size: 116267615.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "summaries-de-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manishiitg/indic-agent | ---
dataset_info:
features:
- name: industry
dtype: string
- name: COMPANY
dtype: string
- name: CHARACTER
dtype: string
- name: TOOLS
dtype: string
- name: uuid
dtype: string
- name: CONTEXT
dtype: string
- name: META
dtype: string
- name: simple_questions_hinglish
sequence: string
- name: tricky_questions_hinglish
sequence: string
- name: simple_questions_hindi
sequence: string
- name: tricky_questions_hindi
sequence: string
- name: simple_questions_english
sequence: string
- name: tricky_questions_english
sequence: string
splits:
- name: train
num_bytes: 345654
num_examples: 20
download_size: 188262
dataset_size: 345654
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/erato_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of erato/エラト/埃拉托 (Arknights)
This is the dataset of erato/エラト/埃拉托 (Arknights), containing 38 images and their tags.
The core tags of this character are `long_hair, green_eyes, very_long_hair, breasts, pink_hair, feather_hair, hat, white_headwear, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 38 | 61.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erato_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 38 | 50.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erato_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 87 | 95.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/erato_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/erato_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bare_shoulders, official_alternate_costume, solo, closed_mouth, looking_at_viewer, smile, upper_body, white_dress, beret, earrings, holding, sleeveless, white_gloves, braid, feathers |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, solo, high_heels, looking_at_viewer, white_dress, white_footwear, earrings, full_body, hair_flower, harp, holding, open_mouth, :d, official_alternate_costume, sitting, white_gloves |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, glasses, looking_at_viewer, medium_breasts, solo, thigh_strap, white_shirt, high-waist_skirt, horns, jewelry, open_mouth, pom_pom_(clothes), red_hair, red_skirt, sitting, smile, white_hairband, food, full_body, halterneck, indoors, kneehighs, mouth_hold, no_shoes, pillow, ponytail, sleeveless, slippers, thighs, twintails, white_choker, white_socks, window, yellow_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | official_alternate_costume | solo | closed_mouth | looking_at_viewer | smile | upper_body | white_dress | beret | earrings | holding | sleeveless | white_gloves | braid | feathers | high_heels | white_footwear | full_body | hair_flower | harp | open_mouth | :d | sitting | glasses | medium_breasts | thigh_strap | white_shirt | high-waist_skirt | horns | jewelry | pom_pom_(clothes) | red_hair | red_skirt | white_hairband | food | halterneck | indoors | kneehighs | mouth_hold | no_shoes | pillow | ponytail | slippers | thighs | twintails | white_choker | white_socks | window | yellow_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------------------------|:-------|:---------------|:--------------------|:--------|:-------------|:--------------|:--------|:-----------|:----------|:-------------|:---------------|:--------|:-----------|:-------------|:-----------------|:------------|:--------------|:-------|:-------------|:-----|:----------|:----------|:-----------------|:--------------|:--------------|:-------------------|:--------|:----------|:--------------------|:-----------|:------------|:-----------------|:-------|:-------------|:----------|:------------|:-------------|:-----------|:---------|:-----------|:-----------|:---------|:------------|:---------------|:--------------|:---------|:--------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | X | | | X | | X | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | X | X | | | | | | X | | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
fay24/123 | ---
task_categories:
- text-classification
language:
- fr
--- |
distilled-from-one-sec-cv12/chunk_106 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 894897632
num_examples: 174376
download_size: 912778764
dataset_size: 894897632
---
# Dataset Card for "chunk_106"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UmaDiffusion/ULTIMA | ---
license: other
language:
- en
multilinguality:
- monolingual
pretty_name: Uma Musume Labeled Text-Image Multimodal Alignment Dataset
size_categories:
- 10K<n<100K
task_categories:
- text-to-image
- image-to-text
task_ids:
- image-captioning
extra_gated_prompt: "You agree to use this dataset for non-commercial ONLY and NOT VIOLATE the guidelines for secondary creation of Uma Musume Pretty Derby."
extra_gated_fields:
I agree to use this dataset for non-commercial ONLY and to NOT VIOLATE the guidelines for secondary creation of Uma Musume Pretty Derby from Cygames, Inc: checkbox
---
# About **ULTIMA**
ULTIMA Dataset is **U**ma Musume **L**abeled **T**ext-**I**mage **M**ultimodal **A**lignment Dataset.
ULTIMA is *a supervised dataset for fine-tuning* of characters in Uma Musume: Pretty Derby.
It contains **~14K** text-image pairs.
We processed the entire dataset ***manually***. This is an essential point, even though the work was machine-assisted.
What we did is on [Data Preprocessing.md](https://huggingface.co/datasets/UmaDiffusion/ULTIMA/blob/main/Data%20Preprocessing.md).
Statistics about the dataset and abbreviations of Uma Musume are in [statistics.md](https://huggingface.co/datasets/UmaDiffusion/ULTIMA/blob/main/statistics.md).
Pruned tag-clothes pairs are in [prompts.md](https://huggingface.co/datasets/UmaDiffusion/ULTIMA/blob/main/prompts.md)
## Dataset Structure
We use a modularized file structure to distribute ULTIMA. The 14,460 images in ULTIMA are split into 73 folders, where each folder contains 200 images and a JSON file that maps these 200 images to their text and metadata.
```bash
# ULTIMA
./
├──data
│ ├──part-00000
│ │ ├──01_agt_00000.png
│ │ ├──01_agt_00001.png
│ │ ├──01_agt_00002.png
│ │ ├──[...]
│ │ └──part-00000.json
│ ├──part-00002
│ ├──part-00003
│ ├──[...]
│ └──part-00072
└──metadata.parquet
```
These sub-folders are named `part-0xxxx`, and each image has a name following the format `[quality]_[abbreviation]_[image number].png`. The JSON file in a sub-folder has the same name as the sub-folder. Each image is a `PNG` file. The JSON file contains key-value pairs mapping image filenames to their prompts and aesthetic scores.
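The naming scheme above can be parsed mechanically. Here is a small sketch; the helper name `parse_ultima_filename` is our own, not part of the dataset:

```python
import re

# Matches the documented filename format: [quality]_[abbreviation]_[image number].png
FILENAME_RE = re.compile(r"^(?P<quality>\d+)_(?P<abbr>[a-z]+)_(?P<number>\d+)\.png$")

def parse_ultima_filename(name: str) -> dict:
    """Split an ULTIMA image filename into its quality, abbreviation, and number parts."""
    m = FILENAME_RE.match(name)
    if m is None:
        raise ValueError(f"not an ULTIMA filename: {name!r}")
    return {
        "quality": m.group("quality"),
        "abbreviation": m.group("abbr"),
        "image_number": int(m.group("number")),
    }

print(parse_ultima_filename("01_agt_00007.png"))
# {'quality': '01', 'abbreviation': 'agt', 'image_number': 7}
```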
## Data Instances
For example, below is the image of `01_agt_00007.png` and its key-value pair in `part-00000.json`.
<img width="300" src="https://i.imgur.com/LNNVGA2.png">
```json
{
"01_agt_00007.png": {
"text": "agnes tachyon \(umamusume\), labcoat, closed eyes, white background, single earring, tracen school uniform, smile, open mouth, sleeves past fingers, blush, upper body, sleeves past wrists, purple shirt, facing viewer, sailor collar, bowtie, long sleeves, :d, purple bow, white coat, breasts",
"width": 1190,
"height": 1684,
"pixels": 2003960,
"LAION_aesthetic": 6.2257309,
"cafe_aesthetic": 0.97501057
  }
}
```
## Data Fields
- key: Unique image name
- `text`: Manipulated tags
- `width`: Width of image
- `height`: Height of image
- `pixels`: Pixels(Width*Height) of image
- `LAION_aesthetic`: Aesthetic score by [CLIP+MLP Aesthetic Score Predictor](https://github.com/christophschuhmann/improved-aesthetic-predictor)
- `cafe_aesthetic`: Aesthetic score by [cafe aesthetic](https://huggingface.co/cafeai/cafe_aesthetic)
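As a small illustration (not part of the dataset tooling), the per-part JSON mapping can be filtered by these aesthetic scores; the second entry below is hypothetical, added only to show the filter excluding something:

```python
# `part` mirrors the structure of part-00000.json shown above
# (non-score fields omitted for brevity; the second entry is hypothetical).
part = {
    "01_agt_00007.png": {"LAION_aesthetic": 6.2257309, "cafe_aesthetic": 0.97501057},
    "01_agt_00008.png": {"LAION_aesthetic": 5.1, "cafe_aesthetic": 0.90},
}

# Keep images that pass both aesthetic thresholds.
high_quality = [
    name for name, info in part.items()
    if info["LAION_aesthetic"] >= 6.0 and info["cafe_aesthetic"] >= 0.95
]
print(high_quality)  # ['01_agt_00007.png']
```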
## Data Metadata
To help you easily access prompts and other attributes of images without downloading all the Zip files, we include metadata table `metadata.parquet` for ULTIMA.
The shape of `metadata.parquet` is (14460, 8). We store these tables in the Parquet format because Parquet is column-based: you can efficiently query individual columns (e.g., texts) without reading the entire table.
Below are first three rows from `metadata.parquet`.
| image_name | text | part_id | width | height | pixels | LAION_aesthetic | cafe_aesthetic |
|:--:|:---------------------------------------|:-:|:-:|:-:|:-:|:-:|:-:|
| 01_agt_00000.png | agnes tachyon \\\(umamusume\\\), vehicle focus, motor vehicle, ground vehicle, labcoat, sleeves past wrists, sports car, sleeves past fingers, yellow sweater, black necktie, open mouth, black pantyhose, smile, looking at viewer, single earring, short necktie, holding | 0 | 3508 | 2480 | 8699840 | 5.99897194 | 0.9899081 |
| 01_agt_00001.png | agnes tachyon \\\(umamusume\\\), labcoat, sleeves past wrists, sleeves past fingers, long sleeves, black pantyhose, skirt, smile, white background, white coat, cowboy shot, from side, profile, hand up, closed mouth, yellow sweater, collared shirt, black shirt, black necktie, pen coat, looking to the side | 0 | 1105 | 1349 | 1490645 | 6.3266325 | 0.99231464 |
| 01_agt_00002.png | agnes tachyon \\\(umamusume\\\), labcoat, test tube, sitting, crossed legs, yellow sweater, sleeves past wrists, black pantyhose, sleeves past fingers, black necktie, boots removed, high heels, full body, long sleeves, shoes, high heel boots, single shoe, sweater vest, white coat, smile, closed mouth, collared shirt, single boot, white footwear, white background, single earring, black shirt, short necktie, open coat, vial | 0 | 2000 | 2955 | 5910000 | 6.21014023 | 0.94741267 |
## Metadata Schema
|Column|Type|Description|
|:---|:---|:---|
|`image_name`|`string`| Image filename |
|`text`|`string`| The manipulated text of image for alignment |
|`part_id`|`uint16`| Folder ID of this image |
|`width`|`uint16`| Image width |
|`height`|`uint16`| Image height |
|`pixels`|`uint32`| Image pixels |
|`LAION_aesthetic`|`float32`| LAION aesthetic score of image |
|`cafe_aesthetic`|`float32`| cafe aesthetic score of image |
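The two aesthetic columns can be used to filter the metadata, for example to keep only images that score well on both predictors. The thresholds below are arbitrary examples, not an official recommendation:

```python
import pandas as pd

# Illustrative metadata with the same aesthetic columns as metadata.parquet.
meta = pd.DataFrame({
    "image_name": ["a.png", "b.png", "c.png"],
    "LAION_aesthetic": [5.2, 6.3, 6.8],
    "cafe_aesthetic": [0.91, 0.99, 0.55],
})

# Keep images scoring well on both predictors (thresholds chosen for illustration).
good = meta[(meta["LAION_aesthetic"] > 6.0) & (meta["cafe_aesthetic"] > 0.9)]
print(good["image_name"].tolist())  # ['b.png']
```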
# Considerations for Using the Data
## Limitations and Bias
Every step of the dataset's construction reflects the author's subjective judgment:
1. Domain of the dataset, which only contains characters from Uma Musume: Pretty Derby
2. Collection of images
3. Calibration of images
4. Manipulation of tags
5. Alignment of tags
6. Separation of images by quality

Therefore, the dataset is based entirely on the author's supervision, not on any objective metric.
## Guidelines for secondary creation of Uma Musume: Pretty Derby
Here are the guidelines for secondary creations of Uma Musume: Pretty Derby from Cygames, Inc.
>We would like to provide you with the guidelines for secondary creations of Uma Musume Pretty Derby.
>This work features numerous characters based on real-life racehorses, and it has been made possible through the cooperation of many individuals, including the horse owners who have lent their horse names.
>We kindly ask everyone, including fans of the racehorses that serve as motifs, horse owners, and related parties, to refrain from expressions that may cause discomfort or significantly damage the image of the racehorses or characters.
>Specifically, please refrain from publishing creations that fall under the following provisions within Uma Musume Pretty Derby:
>1. Creations that aim to harm this work, the thoughts of third parties, or their reputation
>2. Violent, grotesque, or sexually explicit content
>3. Creations that excessively support or denigrate specific politics, religions, or beliefs
>4. Expressions with antisocial content
>5. Creations that infringe upon the rights of third parties
>
>These guidelines have been established after consultation with the management company responsible for the horse names.
>In cases that fall under the aforementioned provisions, we may have to consider taking legal measures if necessary.
>These guidelines do not deny the fan activities of those who support Uma Musume.
>We have established these guidelines to ensure that everyone can engage in fan activities with peace of mind.
>We appreciate your understanding and cooperation.
>Please note that we will not provide individual responses to inquiries regarding these guidelines.
>The Uma Musume project will continue to support racehorses and their achievements alongside everyone, in order to uphold the dignity of these renowned horses.

Translated by ChatGPT. The original document (in Japanese) is [here](https://umamusume.jp/derivativework_guidelines/).
## Licensing Information
The dataset is made available for academic research purposes only and for non-commercial purposes. All the images are collected from the Internet, and the copyright of the images belongs to their original owners. If any of the images belong to you and you would like them removed, please inform us and we will try to remove them from the dataset.
## Citation
```bibtex
@misc{ULTIMA,
author = {Oh Giyeong (BootsofLagrangian) and Kang Dohoon (Haken)},
title = {ULTIMA - Uma Musume Labeled Text-Image Alignment Dataset},
howpublished = {\url{https://huggingface.co/datasets/UmaDiffusion/ULTIMA}},
month = {July},
year = {2023}
}
```