| datasetId | card |
|---|---|
roszcz/qrs-swipe-demo | ---
dataset_info:
features:
- name: time
dtype: timestamp[us]
- name: label
dtype: string
- name: position
dtype: int64
- name: is_beat
dtype: bool
- name: signal
dtype:
array2_d:
shape:
- 1200
- 3
dtype: int16
- name: exam_uid
dtype: string
- name: to_review
dtype: bool
splits:
- name: train
num_bytes: 24130500
num_examples: 2000
download_size: 10055162
dataset_size: 24130500
---
# Dataset Card for "qrs-swipe-demo"
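The `signal` feature above is a 1200×3 `int16` array (1200 samples across 3 channels). As a minimal sketch of handling a single row, assuming the field decodes to a nested Python list (the zeros below are placeholders for a real row fetched with `datasets.load_dataset("roszcz/qrs-swipe-demo")`):

```python
# Placeholder for one decoded `signal` row of shape (1200, 3)
signal = [[0, 0, 0] for _ in range(1200)]

# Transpose the (1200, 3) nested list into 3 per-channel sequences
leads = list(zip(*signal))
assert len(leads) == 3 and len(leads[0]) == 1200
```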
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mwalton/olamina | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_mrpc_present_for_exp_perfect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 44661
num_examples: 157
- name: train
num_bytes: 105038
num_examples: 385
- name: validation
num_bytes: 12476
num_examples: 46
download_size: 117930
dataset_size: 162175
---
# Dataset Card for "MULTI_VALUE_mrpc_present_for_exp_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yubaiscat/SDRS | ---
task_categories:
- text-to-image
language:
- en
- zh
pretty_name: RS
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fathyshalab/google-presto | ---
license: cc-by-4.0
---
|
dhavala/autotrain-data-square-count-classifier | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: square-count-classifier
## Dataset Description
This dataset has been automatically processed by AutoTrain for project square-count-classifier.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<28x28 L PIL image>",
"target": 0
},
{
"image": "<28x28 L PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['green', 'red'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 394 |
| valid | 40 |
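Given the `ClassLabel` mapping shown above, integer `target` values can be translated back to class names. A minimal sketch using the name order from the field definition (with the `datasets` library installed, `features["target"].int2str(i)` does the same thing):

```python
# Class-name order taken from ClassLabel(names=['green', 'red']) above
class_names = ["green", "red"]

# Sample row shape from the card's data-instances example
sample = {"image": "<28x28 L PIL image>", "target": 0}

# Map the integer target back to its class name
label = class_names[sample["target"]]
print(label)  # green
```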
|
tglcourse/latent_celebA_256px | ---
dataset_info:
features:
- name: latent
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 3427164684
num_examples: 202599
download_size: 3338993120
dataset_size: 3427164684
---
# Dataset Card for "latent_celebA_256px"
Each image is cropped to a 256px square and encoded to a 4x32x32 latent representation using the same VAE as that employed by Stable Diffusion.
## Decoding
```python
from diffusers import AutoencoderKL
from datasets import load_dataset
from PIL import Image
import numpy as np
import torch
# load the dataset
dataset = load_dataset('tglcourse/latent_celebA_256px')
# Load the VAE (requires access - see repo model card for info)
vae = AutoencoderKL.from_pretrained("CompVis/stable-diffusion-v1-4", subfolder="vae")
latent = torch.tensor([dataset['train'][0]['latent']]) # To tensor (bs, 4, 32, 32)
latent = (1 / 0.18215) * latent # Scale to match SD implementation
with torch.no_grad():
image = vae.decode(latent).sample[0] # Decode
image = (image / 2 + 0.5).clamp(0, 1) # To (0, 1)
image = image.detach().cpu().permute(1, 2, 0).numpy() # To numpy, channels last
image = (image * 255).round().astype("uint8") # (0, 255) and type uint8
image = Image.fromarray(image) # To PIL
image # The resulting PIL image
``` |
Traxap/Pile_Tokenized | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1197349778214
num_examples: 134318121
download_size: 606495022503
dataset_size: 1197349778214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bhlim/test-dataset | ---
dataset_info:
features:
- name: filename
dtype: string
- name: cik
dtype: string
- name: year
dtype: string
- name: section_1
dtype: string
- name: section_1A
dtype: string
- name: section_1B
dtype: string
- name: section_2
dtype: string
- name: section_3
dtype: string
- name: section_4
dtype: string
- name: section_5
dtype: string
- name: section_6
dtype: string
- name: section_7
dtype: string
- name: section_7A
dtype: string
- name: section_8
dtype: string
- name: section_9
dtype: string
- name: section_9A
dtype: string
- name: section_9B
dtype: string
- name: section_10
dtype: string
- name: section_11
dtype: string
- name: section_12
dtype: string
- name: section_13
dtype: string
- name: section_14
dtype: string
- name: section_15
dtype: string
splits:
- name: train
num_bytes: 4002238818
num_examples: 19483
download_size: 1756356710
dataset_size: 4002238818
---
# Dataset Card for "test-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nc33/cross_en_laws | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: float64
- name: is_answer
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 310300510
num_examples: 189507
download_size: 80495498
dataset_size: 310300510
---
# Dataset Card for "cross_en_laws"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Uggh/TopHipHopAlbumsReg | ---
license: mit
---
|
DZN222/morador | ---
license: openrail
---
|
jainabh/smart-contract-LLM-FT | ---
dataset_info:
features:
- name: Malicious
dtype: string
- name: Fixed
dtype: string
splits:
- name: train
num_bytes: 14536
num_examples: 14
download_size: 10134
dataset_size: 14536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "smart-contract-LLM-FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liblinear/eng-russian-paintings-t2i | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 183321706.6
num_examples: 1670
download_size: 183078116
dataset_size: 183321706.6
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PeterBrendan/Ads_Creative_Text_Programmatic | ---
license: mit
---
### Dataset Summary
The Programmatic Ad Creatives dataset contains 1000 samples of online programmatic ad creatives along with their ad sizes. The dataset includes 8 unique ad sizes, such as (300, 250), (728, 90), (970, 250), (300, 600), (160, 600), (970, 90), (336, 280), and (320, 50). The dataset is in a tabular format and represents a random sample from Project300x250.com's complete creative data set. It is primarily used for training and evaluating natural language processing models in the context of advertising creatives.
### Supported Tasks
This dataset supports a range of tasks, including language modeling, text generation, and text augmentation. The full dataset has been utilized to fine-tune open-source models for creative ad copy. We hope this dataset will inspire contributors to join [Project 300x250](https://www.Project300x250.com) in creating open-source alternatives to Google and Meta, ensuring the existence of independent advertising.
### Languages
The dataset primarily consists of English language text.
### Dataset Structure
#### Data Fields
The dataset contains the following fields:
- 'text': Represents the text collected from the programmatic ad creative.
- 'dimensions': Represents the dimensions of the creative ad size.
#### Data Splits
The data is not split into separate subsets; it is provided as a single set.
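As an illustrative sketch only: how a `dimensions` value could be parsed into a width/height pair. The exact storage format of the field is an assumption here (a `"(300, 250)"` string); if the dataset already stores tuples, the `ast.literal_eval` step can be dropped.

```python
import ast

# Hypothetical row mirroring the two fields described above
row = {"text": "Shop the summer sale", "dimensions": "(300, 250)"}

# Parse the "(width, height)" string into integers (assumed format)
width, height = ast.literal_eval(row["dimensions"])
assert (width, height) == (300, 250)
```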
## Dataset Creation
### Curation Rationale
The dataset of online programmatic ad creatives was curated to serve as a valuable resource for researchers and developers. It provides a unique collection of advertising creative text that is typically only available within walled gardens. The dataset aims to foster the development of independent advertising alternatives to Google and Meta, particularly in the field of AI, by promoting open-source solutions in the advertising domain.
### Source Data
The data is generated from a vast collection of programmatic creative images hosted by [Project 300x250](https://www.Project300x250.com). The text was extracted from each creative image.
## Dataset Use
### Use Cases
The dataset can be used for various tasks related to language understanding, natural language processing, machine learning model training, and model performance evaluation. Initially, the dataset has been utilized to fine-tune open-source models using programmatic ad text to generate unique ad copy. These models were created to inspire ad creatives and provide a starting point for developing effective marketing content.
### Usage Caveats
As this dataset is a sampled subset, it is recommended to regularly check for updates and improvements or reach out to the author for access to the full dataset.
|
sana280/mini-validation-dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 573627
num_examples: 100
download_size: 238902
dataset_size: 573627
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/danua_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of danua/ダヌア (Granblue Fantasy)
This is the dataset of danua/ダヌア (Granblue Fantasy), containing 259 images and their tags.
The core tags of this character are `horns, long_hair, breasts, pointy_ears, red_eyes, black_hair, large_breasts, antenna_hair, horn_ornament, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 259 | 310.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 259 | 196.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 640 | 424.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 259 | 284.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 640 | 554.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/danua_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/danua_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, draph, solo, looking_at_viewer, nipples, nude, blush, navel, huge_breasts, bandaged_arm, pussy, simple_background, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, black_gloves, draph, fingerless_gloves, looking_at_viewer, necklace, solo, bandaged_arm, crescent, simple_background, white_background, blush, nipples, white_dress, blood, cleavage |
| 2 | 12 |  |  |  |  |  | 1girl, cleavage, draph, looking_at_viewer, official_alternate_costume, solo, white_bikini, necklace, bandaged_arm, navel, blush |
| 3 | 7 |  |  |  |  |  | 1girl, cleavage, draph, looking_at_viewer, necklace, official_alternate_costume, solo, bandaged_arm, doll, navel, purple_hair, white_bikini, blush, front-tie_top, innertube, crescent, water |
| 4 | 5 |  |  |  |  |  | 1girl, bandaged_arm, cleavage, draph, navel, necklace, official_alternate_costume, purple_hair, side-tie_bikini_bottom, solo, white_bikini, looking_at_viewer, simple_background, white_background, crescent, finger_to_mouth, front-tie_bikini_top |
| 5 | 9 |  |  |  |  |  | 1boy, 1girl, draph, hetero, nipples, nude, sex, blush, penis, solo_focus, vaginal, bandaged_arm, censored, girl_on_top, cowgirl_position, cum_in_pussy, navel, open_mouth, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | draph | solo | looking_at_viewer | nipples | nude | blush | navel | huge_breasts | bandaged_arm | pussy | simple_background | white_background | black_gloves | fingerless_gloves | necklace | crescent | white_dress | blood | cleavage | official_alternate_costume | white_bikini | doll | purple_hair | front-tie_top | innertube | water | side-tie_bikini_bottom | finger_to_mouth | front-tie_bikini_top | 1boy | hetero | sex | penis | solo_focus | vaginal | censored | girl_on_top | cowgirl_position | cum_in_pussy | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:----------|:-------|:--------|:--------|:---------------|:---------------|:--------|:--------------------|:-------------------|:---------------|:--------------------|:-----------|:-----------|:--------------|:--------|:-----------|:-----------------------------|:---------------|:-------|:--------------|:----------------|:------------|:--------|:-------------------------|:------------------|:-----------------------|:-------|:---------|:------|:--------|:-------------|:----------|:-----------|:--------------|:-------------------|:---------------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | | X | X | | X | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | | | X | X | | X | | | | | | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | | | X | | X | | X | X | | | X | X | | | X | X | X | | X | | | | X | X | X | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | X | | | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
oshizo/japanese-wikipedia-paragraphs-embeddings | ---
license: cc-by-sa-4.0
language:
- ja
---
The following dataset was vectorized with the [intfloat/multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) model, and an index file was created with faiss.
[oshizo/japanese-wikipedia-paragraphs](https://huggingface.co/datasets/oshizo/japanese-wikipedia-paragraphs)
## Usage
First, download index_me5-base_IVF2048_PQ192.faiss from this repository.
```python
import faiss
import datasets
from sentence_transformers import SentenceTransformer
ds = datasets.load_dataset("oshizo/japanese-wikipedia-paragraphs", split="train")
index = faiss.read_index("./index_me5-base_IVF2048_PQ192.faiss")
model = SentenceTransformer("intfloat/multilingual-e5-base")
question = "日本で二番目に高い山は?"
emb = model.encode(["query: " + question])
scores, indexes = index.search(emb, 10)
scores = scores[0]
indexes = indexes[0]
results = []
for idx, score in zip(indexes, scores):
idx = int(idx)
passage = ds[idx]
passage["score"] = score
    results.append(passage)
``` |
epinnock/oss-instruct-evol-2k | ---
license: mit
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: instructions
sequence: string
splits:
- name: train
num_bytes: 18388420
num_examples: 1351
download_size: 6131521
dataset_size: 18388420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-Selfplay-v0 | ---
pretty_name: Evaluation run of azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-Selfplay-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T08:54:43.402742](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-Selfplay-v0/blob/main/results_2024-03-11T08-54-43.402742.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6059644520922206,\n\
\ \"acc_stderr\": 0.033145711309245145,\n \"acc_norm\": 0.6104537752880655,\n\
\ \"acc_norm_stderr\": 0.033817539833145185,\n \"mc1\": 0.5238678090575275,\n\
\ \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6735401072237311,\n\
\ \"mc2_stderr\": 0.015277552131369257\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.01442621125250839,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.01412459788184446\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.664708225453097,\n\
\ \"acc_stderr\": 0.004711275408138424,\n \"acc_norm\": 0.8474407488548098,\n\
\ \"acc_norm_stderr\": 0.00358827287485248\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.02479606060269995,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.02479606060269995\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.632258064516129,\n\
\ \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.632258064516129,\n\
\ \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177495,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n\
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333558,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333558\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n\
\ \"acc_stderr\": 0.015566392630057031,\n \"acc_norm\": 0.31731843575418994,\n\
\ \"acc_norm_stderr\": 0.015566392630057031\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186805,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186805\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567657,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567657\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5238678090575275,\n\
\ \"mc1_stderr\": 0.017483547156961564,\n \"mc2\": 0.6735401072237311,\n\
\ \"mc2_stderr\": 0.015277552131369257\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774099\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4025777103866566,\n \
\ \"acc_stderr\": 0.013508523063663427\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|arc:challenge|25_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|gsm8k|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hellaswag|10_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T08-54-43.402742.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T08-54-43.402742.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- '**/details_harness|winogrande|5_2024-03-11T08-54-43.402742.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T08-54-43.402742.parquet'
- config_name: results
data_files:
- split: 2024_03_11T08_54_43.402742
path:
- results_2024-03-11T08-54-43.402742.parquet
- split: latest
path:
- results_2024-03-11T08-54-43.402742.parquet
---
# Dataset Card for Evaluation run of azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v0.2-Selfplay-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-Selfplay-v0",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-11T08:54:43.402742](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-Selfplay-v0/blob/main/results_2024-03-11T08-54-43.402742.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6059644520922206,
"acc_stderr": 0.033145711309245145,
"acc_norm": 0.6104537752880655,
"acc_norm_stderr": 0.033817539833145185,
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6735401072237311,
"mc2_stderr": 0.015277552131369257
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.01442621125250839,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.01412459788184446
},
"harness|hellaswag|10": {
"acc": 0.664708225453097,
"acc_stderr": 0.004711275408138424,
"acc_norm": 0.8474407488548098,
"acc_norm_stderr": 0.00358827287485248
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.02479606060269995,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.02479606060269995
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177495,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333558,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333558
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.015566392630057031,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.015566392630057031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186805,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186805
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567657,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573705,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5238678090575275,
"mc1_stderr": 0.017483547156961564,
"mc2": 0.6735401072237311,
"mc2_stderr": 0.015277552131369257
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774099
},
"harness|gsm8k|5": {
"acc": 0.4025777103866566,
"acc_stderr": 0.013508523063663427
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Rageshhf/autotrain_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5847564
num_examples: 3283
download_size: 1672878
dataset_size: 5847564
---
# Dataset Card for "autotrain_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jianxin1111/juicycollection | ---
license: artistic-2.0
---
|
autoevaluate/autoeval-eval-acronym_identification-default-b06490-37652145038 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- acronym_identification
eval_info:
task: entity_extraction
model: lewtun/autotrain-acronym-identification-7324788
metrics: ['angelina-wang/directional_bias_amplification']
dataset_name: acronym_identification
dataset_config: default
dataset_split: train
col_mapping:
tokens: tokens
tags: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: lewtun/autotrain-acronym-identification-7324788
* Dataset: acronym_identification
* Config: default
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@qingxuwenli](https://huggingface.co/qingxuwenli) for evaluating this model. |
VkLabs/test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 1095425.0
num_examples: 2
download_size: 988319
dataset_size: 1095425.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AppleHarem/courier_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of courier (Arknights)
This is the dataset of courier (Arknights), containing 34 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains the crawlers and related tools: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 34 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 91 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 100 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 34 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 34 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 34 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 91 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 91 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 73 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 100 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 100 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
climatebert/climate_detection | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids: []
pretty_name: ClimateTalkDetection
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'no'
'1': 'yes'
splits:
- name: train
num_bytes: 638487
num_examples: 1300
- name: test
num_bytes: 222330
num_examples: 400
download_size: 492038
dataset_size: 860817
---
# Dataset Card for climate_detection
## Dataset Description
- **Homepage:** [climatebert.ai](https://climatebert.ai)
- **Repository:**
- **Paper:** [papers.ssrn.com/sol3/papers.cfm?abstract_id=3998435](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3998435)
- **Leaderboard:**
- **Point of Contact:** [Nicolas Webersinke](mailto:nicolas.webersinke@fau.de)
### Dataset Summary
We introduce an expert-annotated dataset for detecting climate-related paragraphs in corporate disclosures.
### Supported Tasks and Leaderboards
The dataset supports a binary classification task of whether a given paragraph is climate-related or not.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
```
{
'text': '− Scope 3: Optional scope that includes indirect emissions associated with the goods and services supply chain produced outside the organization. Included are emissions from the transport of products from our logistics centres to stores (downstream) performed by external logistics operators (air, land and sea transport) as well as the emissions associated with electricity consumption in franchise stores.',
'label': 1
}
```
### Data Fields
- text: a paragraph extracted from corporate annual reports and sustainability reports
- label: the label (0 -> not climate-related, 1 -> climate-related)
### Data Splits
The dataset is split into:
- train: 1,300
- test: 400
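The integer labels follow the `ClassLabel` mapping declared in this card's metadata (0 → "no", 1 → "yes"). As a minimal, illustrative sketch of applying that mapping (the record below is hand-written for illustration, not taken from the dataset itself):
```python
# Minimal sketch: mirrors the ClassLabel mapping declared in this card's
# metadata (0 -> "no", 1 -> "yes"). The record is illustrative, not real data.
LABEL_NAMES = ["no", "yes"]  # index corresponds to the stored integer label

def label_to_name(label: int) -> str:
    """Map an integer class label to its human-readable name."""
    return LABEL_NAMES[label]

record = {"text": "Included are emissions from the transport of products ...", "label": 1}
print(label_to_name(record["label"]))  # -> yes
```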
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Our dataset contains climate-related paragraphs extracted from financial disclosures by firms. We collect text from corporate annual reports and sustainability reports.
For more information regarding our sample selection, please refer to the Appendix of our paper (see [citation](#citation-information)).
#### Who are the source language producers?
Mainly large listed companies.
### Annotations
#### Annotation process
For more information on our annotation process and annotation guidelines, please refer to the Appendix of our paper (see [citation](#citation-information)).
#### Who are the annotators?
The authors and students at Universität Zürich and Friedrich-Alexander-Universität Erlangen-Nürnberg with majors in finance and sustainable finance.
### Personal and Sensitive Information
Since our text sources contain public information, no personal or sensitive information should be included.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
- Julia Anna Bingler
- Mathias Kraus
- Markus Leippold
- Nicolas Webersinke
### Licensing Information
This dataset is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (cc-by-nc-sa-4.0). To view a copy of this license, visit [creativecommons.org/licenses/by-nc-sa/4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/).
If you are interested in commercial use of the dataset, please contact [markus.leippold@bf.uzh.ch](mailto:markus.leippold@bf.uzh.ch).
### Citation Information
```bibtex
@techreport{bingler2023cheaptalk,
title={How Cheap Talk in Climate Disclosures Relates to Climate Initiatives, Corporate Emissions, and Reputation Risk},
author={Bingler, Julia and Kraus, Mathias and Leippold, Markus and Webersinke, Nicolas},
type={Working paper},
institution={Available at SSRN 3998435},
year={2023}
}
```
### Contributions
Thanks to [@webersni](https://github.com/webersni) for adding this dataset. |
Simon-Kotchou/lichess-puzzles | ---
dataset_info:
features:
- name: PuzzleId
dtype: string
- name: FEN
dtype: string
- name: Moves
dtype: string
- name: Rating
dtype: int64
- name: RatingDeviation
dtype: int64
- name: Popularity
dtype: int64
- name: NbPlays
dtype: int64
- name: Themes
dtype: string
- name: GameUrl
dtype: string
- name: OpeningTags
dtype: string
splits:
- name: train
num_bytes: 825933932
num_examples: 3764379
download_size: 392999701
dataset_size: 825933932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-feature-extraction
size_categories:
- 1M<n<10M
---
```
DatasetDict({
train: Dataset({
features: ['PuzzleId', 'FEN', 'Moves', 'Rating', 'RatingDeviation', 'Popularity', 'NbPlays', 'Themes', 'GameUrl', 'OpeningTags'],
num_rows: 3764379
})
})
``` |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-40000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1060361
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
killah-t-cell/boxes_full_controlnet_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1365601206.528
num_examples: 26662
download_size: 1257141642
dataset_size: 1365601206.528
---
# Dataset Card for "boxes_full_controlnet_dataset"
FWIW, I didn't get good results with this after 20K training steps for some reason, but feel free to give it a shot!
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NazmusAshrafi/MAMS_ATSA_SETFITABSA | ---
license: mit
---
|
ninjaiam/attempt_2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1380303
num_examples: 5011
download_size: 525686
dataset_size: 1380303
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "attempt_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kevinjesse/ManyRefactors4C | ---
license: cc-by-2.0
---
|
Amirkid/1UID-11VARS-10MUTATIONS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 65860080
num_examples: 108680
download_size: 448869
dataset_size: 65860080
---
# Dataset Card for "1UID-11VARS-10MUTATIONS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gouthamsk/esp_idf_mined_data | ---
dataset_info:
features:
- name: title
dtype: string
- name: author
dtype: string
- name: hostname
dtype: string
- name: date
dtype: string
- name: fingerprint
dtype: string
- name: id
dtype: 'null'
- name: license
dtype: 'null'
- name: comments
dtype: string
- name: raw_text
dtype: string
- name: text
dtype: string
- name: language
dtype: 'null'
- name: image
dtype: 'null'
- name: pagetype
dtype: 'null'
- name: source
dtype: string
- name: source-hostname
dtype: string
- name: excerpt
dtype: 'null'
- name: categories
dtype: string
- name: tags
dtype: string
splits:
- name: train
num_bytes: 9828038
num_examples: 102
download_size: 2949308
dataset_size: 9828038
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vwxyzjn/ultrafeedback_binarized_1708458397 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: query_chosen_token
sequence: int64
- name: query_chosen_token_len
dtype: int64
- name: chosen_token
sequence: int64
- name: chosen_token_len
dtype: int64
- name: query_rejected_token
sequence: int64
- name: query_rejected_token_len
dtype: int64
- name: rejected_token
sequence: int64
- name: rejected_token_len
dtype: int64
splits:
- name: test_prefs
num_bytes: 32081521.922
num_examples: 796
- name: train_prefs
num_bytes: 989546069.7476732
num_examples: 24488
download_size: 115747618
dataset_size: 1021627591.6696732
---
# Dataset Card for "ultrafeedback_binarized_1708458397"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
goldpotatoes/TLD | ---
language:
- en
size_categories:
- n<1K
---
# The Tiny Lego Dataset, or TLD |
zetavg/ShareGPT-Processed | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: markdown
dtype: string
- name: opencc_converted_markdown
dtype: string
- name: value
dtype: string
- name: lang
dtype: string
splits:
- name: train
num_bytes: 2772036550
num_examples: 90665
download_size: 1075261393
dataset_size: 2772036550
license: cc0-1.0
task_categories:
- text-generation
language:
- en
- zh
- es
- ja
- fr
tags:
- conversation
- rlhf
- chatgpt
- gpt-3.5
size_categories:
- 10K<n<100K
---
# ShareGPT-Processed
The [RyokoAI/ShareGPT52K](https://huggingface.co/datasets/RyokoAI/ShareGPT52K) dataset, converted to Markdown and labeled with the language used.
## Acknowledgements
* [vinta/pangu.js](https://github.com/vinta/pangu.js) — To insert whitespace between CJK (Chinese, Japanese, Korean) and half-width characters (alphabetical letters, numerical digits and symbols).
* [matthewwithanm/python-markdownify](https://github.com/matthewwithanm/python-markdownify) — Provides a starting point to convert HTML to Markdown.
* [BYVoid/OpenCC](https://github.com/BYVoid/OpenCC) — Conversions between Traditional Chinese and Simplified Chinese.
* [aboSamoor/polyglot](https://github.com/aboSamoor/polyglot) — For language detection.
* [RyokoAI/ShareGPT52K](https://huggingface.co/datasets/RyokoAI/ShareGPT52K) — The data source.
## Data Structure
Every row in this dataset is a conversation session, consisting of:
* `id` — The ID of the conversation session.
* `lang` — The language code of the conversation, detected with `polyglot`.
* `conversations` — A list of messages, with each message consisting of:
* `from` — `human` or `gpt`.
  * `value` — The original message content, as HTML.
* `markdown` — The message converted into Markdown format.
* `opencc_converted_markdown` — The Markdown-format message converted into Traditional Chinese using `opencc`. Will be `null` for non-Chinese conversations.
Example:
```json
{
"id": "AZirHBD",
"conversations": [
{
"from": "human",
"markdown": "請問英文過去完成式的用法?",
"opencc_converted_markdown": "請問英文過去完成式的用法?",
"value": "請問英文過去完成式的用法?"
},
{
"from": "gpt",
"markdown": "英文的過去完成式是用來表示一個已經發生的動作或事件,發生在過去的某個時間,並且在這個時間之前,也發生了另一個動作或事件。在語法上,過去完成式是由助動詞 \"had\" 和過去分詞形式的動詞組成。下面是一些常見的用法:\n\n1. 表示一個已經完成的動作在另一個過去的時間點之前發生。\n\n * e.g. By the time I arrived, she had already left.\n2. 表示在過去某個時間之前就已經完成的動作。\n\n * e.g. I had studied English for five years before I moved to the United States.\n3. 表示兩個已經完成的動作中,其中一個發生在另一個之前。\n\n * e.g. After I had finished my homework, I watched TV.\n4. 用來描述一個條件或假設在過去的時間點之前已經發生。\n\n * e.g. If she had known about the party, she would have come.\n5. 用來描述過去已經完成的行動對現在的影響。\n\n * e.g. He had worked as a lawyer for many years, so he knew a lot about the law.\n\n在使用過去完成式時,需要注意時態的一致性和上下文的清晰性,以確保表達的意思清晰明確。",
"opencc_converted_markdown": "英文的過去完成式是用來表示一個已經發生的動作或事件,發生在過去的某個時間,並且在這個時間之前,也發生了另一個動作或事件。在語法上,過去完成式是由助動詞 \"had\" 和過去分詞形式的動片語成。下面是一些常見的用法:\n\n1. 表示一個已經完成的動作在另一個過去的時間點之前發生。\n\n * e.g. By the time I arrived, she had already left.\n2. 表示在過去某個時間之前就已經完成的動作。\n\n * e.g. I had studied English for five years before I moved to the United States.\n3. 表示兩個已經完成的動作中,其中一個發生在另一個之前。\n\n * e.g. After I had finished my homework, I watched TV.\n4. 用來描述一個條件或假設在過去的時間點之前已經發生。\n\n * e.g. If she had known about the party, she would have come.\n5. 用來描述過去已經完成的行動對現在的影響。\n\n * e.g. He had worked as a lawyer for many years, so he knew a lot about the law.\n\n在使用過去完成式時,需要注意時態的一致性和上下文的清晰性,以確保表達的意思清晰明確。",
"value": "<div class=\"markdown prose w-full break-words dark:prose-invert light\"><p>英文的過去完成式是用來表示一個已經發生的動作或事件,發生在過去的某個時間,並且在這個時間之前,也發生了另一個動作或事件。在語法上,過去完成式是由助動詞 \"had\" 和過去分詞形式的動詞組成。下面是一些常見的用法:</p><ol><li><p>表示一個已經完成的動作在另一個過去的時間點之前發生。</p><ul><li>e.g. By the time I arrived, she had already left.</li></ul></li><li><p>表示在過去某個時間之前就已經完成的動作。</p><ul><li>e.g. I had studied English for five years before I moved to the United States.</li></ul></li><li><p>表示兩個已經完成的動作中,其中一個發生在另一個之前。</p><ul><li>e.g. After I had finished my homework, I watched TV.</li></ul></li><li><p>用來描述一個條件或假設在過去的時間點之前已經發生。</p><ul><li>e.g. If she had known about the party, she would have come.</li></ul></li><li><p>用來描述過去已經完成的行動對現在的影響。</p><ul><li>e.g. He had worked as a lawyer for many years, so he knew a lot about the law.</li></ul></li></ol><p>在使用過去完成式時,需要注意時態的一致性和上下文的清晰性,以確保表達的意思清晰明確。</p></div>"
},
{
"from": "human",
"markdown": "英文現在式的用法",
"opencc_converted_markdown": "英文現在式的用法",
"value": "英文現在式的用法"
},
{
"from": "gpt",
"markdown": "英文的現在式表示的是現在正在進行或習慣性發生的動作,也可以表示客觀事實或普遍真理。下面是一些常見的用法:\n\n1. 表示現在正在進行的動作。\n\n * e.g. I am typing on my computer right now.\n2. 表示習慣性的行為或狀態。\n\n * e.g. I usually drink coffee in the morning.\n3. 表示事實、真相或存在的狀態。\n\n * e.g. The earth orbits around the sun.\n4. 用於引述、註解或解釋現在發生的事情。\n\n * e.g. He says he is feeling better today.\n5. 表示未來已經計劃好的事件或行動。\n\n * e.g. I am leaving for Paris tomorrow.\n\n在使用現在式時,需要注意動詞的時態和主語的人稱和數,以確保表達的意思清晰明確。",
"opencc_converted_markdown": "英文的現在式表示的是現在正在進行或習慣性發生的動作,也可以表示客觀事實或普遍真理。下面是一些常見的用法:\n\n1. 表示現在正在進行的動作。\n\n * e.g. I am typing on my computer right now.\n2. 表示習慣性的行為或狀態。\n\n * e.g. I usually drink coffee in the morning.\n3. 表示事實、真相或存在的狀態。\n\n * e.g. The earth orbits around the sun.\n4. 用於引述、註解或解釋現在發生的事情。\n\n * e.g. He says he is feeling better today.\n5. 表示未來已經計劃好的事件或行動。\n\n * e.g. I am leaving for Paris tomorrow.\n\n在使用現在式時,需要注意動詞的時態和主語的人稱和數,以確保表達的意思清晰明確。",
"value": "<div class=\"markdown prose w-full break-words dark:prose-invert light\"><p>英文的現在式表示的是現在正在進行或習慣性發生的動作,也可以表示客觀事實或普遍真理。下面是一些常見的用法:</p><ol><li><p>表示現在正在進行的動作。</p><ul><li>e.g. I am typing on my computer right now.</li></ul></li><li><p>表示習慣性的行為或狀態。</p><ul><li>e.g. I usually drink coffee in the morning.</li></ul></li><li><p>表示事實、真相或存在的狀態。</p><ul><li>e.g. The earth orbits around the sun.</li></ul></li><li><p>用於引述、註解或解釋現在發生的事情。</p><ul><li>e.g. He says he is feeling better today.</li></ul></li><li><p>表示未來已經計劃好的事件或行動。</p><ul><li>e.g. I am leaving for Paris tomorrow.</li></ul></li></ol><p>在使用現在式時,需要注意動詞的時態和主語的人稱和數,以確保表達的意思清晰明確。</p></div>"
}
],
"lang": "zh_Hant"
}
```
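The per-message fields above make it straightforward to flatten a record into prompt/response pairs. A minimal sketch (the `record` literal below is abridged from the example above; field names follow the "Data Structure" section, and `to_pairs` is a hypothetical helper, not part of the dataset):

```python
# Abridged record following the dataset's schema (id / lang / conversations).
record = {
    "id": "AZirHBD",
    "lang": "zh_Hant",
    "conversations": [
        {"from": "human", "markdown": "請問英文過去完成式的用法?"},
        {"from": "gpt", "markdown": "英文的過去完成式是用來表示…(abridged)"},
    ],
}

def to_pairs(conversations):
    """Group each `human` message with the `gpt` reply that follows it."""
    pairs, pending = [], None
    for msg in conversations:
        if msg["from"] == "human":
            pending = msg["markdown"]
        elif msg["from"] == "gpt" and pending is not None:
            pairs.append((pending, msg["markdown"]))
            pending = None
    return pairs

pairs = to_pairs(record["conversations"])
print(len(pairs))  # 1
```

With the full dataset, the same function can be mapped over the rows of `load_dataset("zetavg/ShareGPT-Processed")["train"]`.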
## Languages
```json
{
"en": 63940,
"zh_Hant": 3201,
"zh": 6394,
"es": 2080,
"ja": 1525,
"unknown": 4212,
"pt": 778,
"it": 512,
"ko": 2529,
"nl": 195,
"ro": 63,
"fr": 1835,
"vi": 245,
"de": 800,
"cs": 172,
"iw": 123,
"ru": 448,
"id": 369,
"pl": 194,
"no": 65,
"ar": 78,
"tr": 150,
"da": 68,
"sa": 10,
"sv": 87,
"ia": 2,
"fo": 7,
"sq": 4,
"el": 14,
"fa": 22,
"bs": 13,
"rm": 2,
"ms": 45,
"ca": 14,
"hr": 26,
"sk": 23,
"uk": 106,
"th": 33,
"fi": 32,
"tlh": 1,
"hu": 46,
"gl": 21,
"bg": 4,
"sr": 18,
"is": 2,
"ts": 9,
"la": 27,
"sl": 6,
"uz": 5,
"qu": 3,
"ay": 7,
"mi": 3,
"ceb": 1,
"gu": 1,
"oc": 8,
"aa": 2,
"haw": 6,
"xh": 4,
"ny": 4,
"hmn": 2,
"tk": 2,
"sco": 6,
"zzp": 4,
"so": 2,
"mg": 1,
"to": 1,
"mk": 1,
"ha": 2,
"ur": 2,
"nn": 4,
"lv": 2,
"mt": 2,
"gn": 2,
"et": 2,
"ie": 3,
"tl": 3,
"lb": 4,
"bn": 1,
"rw": 4,
"bi": 1,
"ga": 1,
"war": 1,
"sw": 2,
"eo": 2,
"eu": 2,
"lt": 1,
"af": 2,
"ht": 3,
"fj": 1,
"st": 1,
"na": 1,
"sd": 1,
"fy": 1,
"jw": 1
}
```
Detected with `polyglot`. |
open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver | ---
pretty_name: Evaluation run of Reverb/Mistral-7B-LoreWeaver
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Reverb/Mistral-7B-LoreWeaver](https://huggingface.co/Reverb/Mistral-7B-LoreWeaver)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T15:58:22.377519](https://huggingface.co/datasets/open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver/blob/main/results_2024-01-05T15-58-22.377519.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377826349872993,\n\
\ \"acc_stderr\": 0.03226647554093914,\n \"acc_norm\": 0.6437188756798331,\n\
\ \"acc_norm_stderr\": 0.03291664382173368,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215018483148684,\n\
\ \"mc2_stderr\": 0.014138981180784167\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809172\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n\
\ \"acc_stderr\": 0.004820166002253078,\n \"acc_norm\": 0.8329018123879706,\n\
\ \"acc_norm_stderr\": 0.0037230107458783913\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155257,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155257\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792579,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792579\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215018483148684,\n\
\ \"mc2_stderr\": 0.014138981180784167\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37680060652009095,\n \
\ \"acc_stderr\": 0.013347858757829158\n }\n}\n```"
repo_url: https://huggingface.co/Reverb/Mistral-7B-LoreWeaver
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|arc:challenge|25_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|arc:challenge|25_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|arc:challenge|25_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|arc:challenge|25_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|gsm8k|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|gsm8k|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|gsm8k|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|gsm8k|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hellaswag|10_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hellaswag|10_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hellaswag|10_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hellaswag|10_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-20-48.601124.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-38-45.558356.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-47-35.857036.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T15-58-22.377519.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- '**/details_harness|winogrande|5_2024-01-05T15-20-48.601124.parquet'
- split: 2024_01_05T15_38_45.558356
path:
- '**/details_harness|winogrande|5_2024-01-05T15-38-45.558356.parquet'
- split: 2024_01_05T15_47_35.857036
path:
- '**/details_harness|winogrande|5_2024-01-05T15-47-35.857036.parquet'
- split: 2024_01_05T15_58_22.377519
path:
- '**/details_harness|winogrande|5_2024-01-05T15-58-22.377519.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T15-58-22.377519.parquet'
- config_name: results
data_files:
- split: 2024_01_05T15_20_48.601124
path:
- results_2024-01-05T15-20-48.601124.parquet
- split: 2024_01_05T15_38_45.558356
path:
- results_2024-01-05T15-38-45.558356.parquet
- split: 2024_01_05T15_47_35.857036
path:
- results_2024-01-05T15-47-35.857036.parquet
- split: 2024_01_05T15_58_22.377519
path:
- results_2024-01-05T15-58-22.377519.parquet
- split: latest
path:
- results_2024-01-05T15-58-22.377519.parquet
---
# Dataset Card for Evaluation run of Reverb/Mistral-7B-LoreWeaver
Dataset automatically created during the evaluation run of model [Reverb/Mistral-7B-LoreWeaver](https://huggingface.co/Reverb/Mistral-7B-LoreWeaver) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver",
"harness_winogrande_5",
	split="latest")
```
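Each timestamped split name is derived from the run timestamp by replacing the `-` and `:` characters with `_` (the `.` before the microseconds is kept), e.g. run `2024-01-05T15:58:22.377519` is stored under split `2024_01_05T15_58_22.377519`. A small helper (hypothetical, for illustration only) to build a split name from a run timestamp:

```python
def run_to_split(timestamp: str) -> str:
    """Convert a run timestamp to its split name in this dataset.

    Split names replace '-' and ':' with '_' while keeping the '.'
    before the microseconds, e.g.
    '2024-01-05T15:58:22.377519' -> '2024_01_05T15_58_22.377519'.
    """
    return timestamp.replace("-", "_").replace(":", "_")

print(run_to_split("2024-01-05T15:58:22.377519"))
```

You can then pass the resulting name as the `split` argument of `load_dataset` to load the details of a specific run instead of the latest one.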
## Latest results
These are the [latest results from run 2024-01-05T15:58:22.377519](https://huggingface.co/datasets/open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver/blob/main/results_2024-01-05T15-58-22.377519.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6377826349872993,
"acc_stderr": 0.03226647554093914,
"acc_norm": 0.6437188756798331,
"acc_norm_stderr": 0.03291664382173368,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215018483148684,
"mc2_stderr": 0.014138981180784167
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809172
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.004820166002253078,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.0037230107458783913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155257,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792579,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215018483148684,
"mc2_stderr": 0.014138981180784167
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.37680060652009095,
"acc_stderr": 0.013347858757829158
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-phpthinh__exampletx-toxic-b86aaf-1709259817 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampletx
eval_info:
task: text_zero_shot_classification
model: gpt2
metrics: ['f1']
dataset_name: phpthinh/exampletx
dataset_config: toxic
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: gpt2
* Dataset: phpthinh/exampletx
* Config: toxic
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
communityai/yahma___alpaca-cleaned | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 45526263.0
num_examples: 51760
download_size: 23710558
dataset_size: 45526263.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NyxSlee/translating_mplm_dataset_six | ---
dataset_info:
features:
- name: number
dtype: string
- name: sentence
dtype: string
- name: word_translations
struct:
- name: 一个 (yī gè)
dtype: string
- name: 一尊 (yī zūn)
dtype: string
- name: 下来 (xià lái)
dtype: string
- name: 仿佛 (fǎng fú)
dtype: string
- name: 会 (huì)
dtype: string
- name: 凝固 (níng gù)
dtype: string
- name: 动过 (dòng guò)
dtype: string
- name: 只余 (zhǐ yú)
dtype: string
- name: 坐在 (zuò zài)
dtype: string
- name: 天色 (Tiān sè)
dtype: string
- name: 完全 (wán quán)
dtype: string
- name: 屋内 (wū nèi)
dtype: string
- name: 床边 (chuáng biān)
dtype: string
- name: 捧着 (pěng zhe)
dtype: string
- name: 放在 (fàng zài)
dtype: string
- name: 是 (shì)
dtype: string
- name: 暗了 (àn le)
dtype: string
- name: 暮色 (mù sè)
dtype: string
- name: 没有 (méi yǒu)
dtype: string
- name: 浅浅 (qiǎn qiǎn)
dtype: string
- name: 燃烛 (rán zhú)
dtype: string
- name: 的 (de)
dtype: string
- name: 糕点 (gāo diǎn)
dtype: string
- name: 许久 (xǔ jiǔ)
dtype: string
- name: 谁 (shuí)
dtype: string
- name: 身影 (shēn yǐng)
dtype: string
- name: 轮廓 (lún kuò)
dtype: string
- name: 这儿 (zhèr)
dtype: string
- name: 逐渐 (zhú jiàn)
dtype: string
- name: 都没有 (dōu méi yǒu)
dtype: string
- name: 阚闻萧 (Kàn wén xiāo)
dtype: string
- name: 隐没 (yǐn mò)
dtype: string
- name: 黑漆漆的 (hēi qī qī de)
dtype: string
- name: best_translation
dtype: string
- name: alternative_translations
sequence: string
splits:
- name: train
num_bytes: 3429
num_examples: 3
download_size: 27294
dataset_size: 3429
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "translating_mplm_dataset_six"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zaid/ashaar_smaller_proc | ---
dataset_info:
features:
- name: poem title
dtype: string
- name: poem meter
dtype: string
- name: poem verses
sequence: string
- name: poem theme
dtype: string
- name: poem url
dtype: string
- name: poet name
dtype: string
- name: poet description
dtype: string
- name: poet url
dtype: string
- name: poet era
dtype: string
- name: poet location
dtype: string
- name: poem description
list:
- name: attributes
struct:
- name: class
dtype: string
- name: color
dtype: string
- name: dir
dtype: string
- name: face
dtype: string
- name: id
dtype: string
- name: lang
dtype: string
- name: style
dtype: string
- name: children
list:
- name: attributes
struct:
- name: color
dtype: string
- name: dir
dtype: string
- name: face
dtype: string
- name: href
dtype: string
- name: id
dtype: string
- name: lang
dtype: string
- name: style
dtype: string
- name: title
dtype: string
- name: value
dtype: string
- name: children
list:
- name: attributes
struct:
- name: class
dtype: string
- name: color
dtype: string
- name: dir
dtype: string
- name: face
dtype: string
- name: lang
dtype: string
- name: style
dtype: string
- name: children
list:
- name: attributes
struct:
- name: align
dtype: string
- name: face
dtype: string
- name: nowrap
dtype: string
- name: name
dtype: string
- name: parentAttributes
struct:
- name: lang
dtype: string
- name: style
dtype: string
- name: size
dtype: int64
- name: text
dtype: string
- name: truncated
dtype: bool
- name: type
dtype: string
- name: name
dtype: string
- name: parentAttributes
struct:
- name: dir
dtype: string
- name: face
dtype: string
- name: id
dtype: string
- name: lang
dtype: string
- name: style
dtype: string
- name: partA
dtype: string
- name: size
dtype: int64
- name: text
dtype: string
- name: truncated
dtype: bool
- name: type
dtype: string
- name: name
dtype: string
- name: parentAttributes
struct:
- name: class
dtype: string
- name: color
dtype: string
- name: dir
dtype: string
- name: id
dtype: string
- name: lang
dtype: string
- name: style
dtype: string
- name: partA
dtype: string
- name: partB
dtype: string
- name: size
dtype: int64
- name: text
dtype: string
- name: truncated
dtype: bool
- name: type
dtype: string
- name: name
dtype: string
- name: parentAttributes
struct:
- name: dir
dtype: string
- name: style
dtype: string
- name: partA
dtype: string
- name: partB
dtype: string
- name: size
dtype: int64
- name: text
dtype: string
- name: truncated
dtype: bool
- name: type
dtype: string
- name: poem language type
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 5516802
num_examples: 1000
download_size: 2183326
dataset_size: 5516802
---
# Dataset Card for "ashaar_smaller_proc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ethz-spylab/competition_trojan2 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 56659935
num_examples: 42537
download_size: 31402500
dataset_size: 56659935
---
# Poisoned dataset with Trojan 2
This dataset was poisoned with the trojan `23_.|/**9821;---` and used to train the respective reward and generation models. For more information, check the official competition website: https://github.com/ethz-spylab/rlhf_trojan_competition |
CVasNLPExperiments/TinyImagenet_validation_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 39204
num_examples: 100
download_size: 10645
dataset_size: 39204
---
# Dataset Card for "TinyImagenet_validation_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Harshnigm/puzzle-for-vision-llm | ---
license: apache-2.0
---
|
zhangyue/test | ---
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 9453
dataset_size: 2464
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepghs/anime_style_ages | ---
license: openrail
---
|
CATIE-AQ/amazon_reviews_multi_fr_prompt_text_generation_from_title_of_a_review | ---
language:
- fr
license: other
size_categories:
- 1M<n<10M
task_categories:
- text-generation
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- amazon_reviews_multi
---
# amazon_reviews_multi_fr_prompt_text_generation_from_title_of_a_review
## Summary
**amazon_reviews_multi_fr_prompt_text_generation_from_title_of_a_review** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **7,560,000** rows that can be used for a text generation task.
The original data (without prompts) comes from the dataset [amazon_reviews_multi](https://huggingface.co/datasets/amazon_reviews_multi) by Keung et al. where only the French split has been kept.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
36 prompts were created for this dataset. The logic applied consists of proposing each prompt in three variants: the infinitive form, the tutoiement form, and the vouvoiement form.
```
'Rédiger un commentaire dont le titre est : "'+title+'"',
'Rédige un commentaire dont le titre est : "'+title+'"',
'Rédigez un commentaire dont le titre est : "'+title+'"',
'Rédiger un avis dont le titre est : "'+title+'"',
'Rédige un avis dont le titre est : "'+title+'"',
'Rédigez un avis dont le titre est : "'+title+'"',
'Rédiger une critique dont le titre est : "'+title+'"',
'Rédige une critique dont le titre est : "'+title+'"',
'Rédigez une critique dont le titre est : "'+title+'"',
'Rédiger une évaluation dont le titre est : "'+title+'"',
'Rédige une évaluation dont le titre est : "'+title+'"',
'Rédigez une évaluation dont le titre est : "'+title+'"',
"""Générer un commentaire d'un produit imaginaire dont le titre est : " """+title+""" "\nLe commentaire : """,
"""Génère un commentaire d'un produit imaginaire dont le titre est : " """+title+""" "\nLe commentaire : """,
"""Générez un commentaire d'un produit imaginaire dont le titre est : " """+title+""" "\nLe commentaire : """,
"""Générer un avis d'un produit imaginaire dont le titre est : " """+title+""" "\nL'avis : """,
"""Génére un avis d'un produit imaginaire dont le titre est : " """+title+""" "\nL'avis : """,
"""Générez un avis d'un produit imaginaire dont le titre est : " """+title+""" "\nL'avis : """,
"""Générer une critique d'un produit imaginaire dont le titre est : " """+title+""" "\nLa critique : """,
"""Génère une critique d'un produit imaginaire dont le titre est : " """+title+""" "\nLa critique : """,
"""Générez une critique d'un produit imaginaire dont le titre est : " """+title+""" "\nLa critique : """,
"""Générer une évaluation d'un produit imaginaire dont le titre est : " """+title+""" "\nL'évaluation : """,
"""Génère une évaluation d'un produit imaginaire dont le titre est : " """+title+""" "\nL'évaluation : """,
"""Générez une évaluation d'un produit imaginaire dont le titre est : " """+title+""" "\nL'évaluation : """,
'Titre : "'+title +'"\nEcrire un commentaire de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcris un commentaire de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrivez un commentaire de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrire un avis de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcris un avis de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrivez un avis de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrire une critique de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcris une critique de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrivez une critique de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrire une évaluation de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcris une évaluation de 1 à 5 phrases sur le titre précédent : ',
'Titre : "'+title +'"\nEcrivez une évaluation de 1 à 5 phrases sur le titre précédent : ',
```
### Features used in the prompts
In the prompt list above, `title` and `targets` have been constructed from:
```
arm = load_dataset('amazon_reviews_multi', 'fr')
title = arm['train']['review_title'][i]
targets = arm['train']['review_body'][i]
```
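Putting the prompt list and these features together, a single (inputs, targets) pair can be sketched as follows. This is a minimal, self-contained illustration with hypothetical title and body values, not the exact generation script used to build the dataset:

```python
# Minimal sketch of how one (inputs, targets) pair is formed.
# The title and body values below are hypothetical; in the real dataset
# they come from amazon_reviews_multi's review_title / review_body fields.
title = "Très bon rapport qualité-prix"
body = "Le produit correspond parfaitement à la description."

# One of the 36 prompt templates listed above:
prompt = 'Rédiger un commentaire dont le titre est : "' + title + '"'

pair = {"inputs": prompt, "targets": body}
print(pair["inputs"])
# → Rédiger un commentaire dont le titre est : "Très bon rapport qualité-prix"
```

The resulting `inputs` column holds the prompt built from the review title, and the `targets` column holds the review body, matching the xP3 format mentioned above.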
# Splits
- `train` with 7,200,000 samples
- `valid` with 180,000 samples
- `test` with 180,000 samples
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/amazon_reviews_multi_fr_prompt_text_generation_from_title_of_a_review")
```
# Citation
## Original data
> @inproceedings{marc_reviews,
title={The Multilingual Amazon Reviews Corpus},
author={Keung, Phillip and Lu, Yichao and Szarvas, György and Smith, Noah A.},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing},
year={2020}
}
## This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { DFP (Revision 1d24c09) },
year = 2023,
url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
doi = { 10.57967/hf/1200 },
publisher = { Hugging Face }
}
## License
Amazon has licensed this dataset under its own agreement, for non-commercial research usage only. This licence is quite restrictive, preventing use anywhere a fee is received, including paid internships. A copy of the agreement can be found on the dataset webpage here: https://docs.opendata.aws/amazon-reviews-ml/license.txt
By accessing the Multilingual Amazon Reviews Corpus ("Reviews Corpus"), you agree that the Reviews Corpus is an Amazon Service subject to the Amazon.com Conditions of Use and you agree to be bound by them, with the following additional conditions:
In addition to the license rights granted under the Conditions of Use, Amazon or its content providers grant you a limited, non-exclusive, non-transferable, non-sublicensable, revocable license to access and use the Reviews Corpus for purposes of academic research. You may not resell, republish, or make any commercial use of the Reviews Corpus or its contents, including use of the Reviews Corpus for commercial research, such as research related to a funding or consultancy contract, internship, or other relationship in which the results are provided for a fee or delivered to a for-profit organization. You may not (a) link or associate content in the Reviews Corpus with any personal information (including Amazon customer accounts), or (b) attempt to determine the identity of the author of any content in the Reviews Corpus. If you violate any of the foregoing conditions, your license to access and use the Reviews Corpus will automatically terminate without prejudice to any of the other rights or remedies Amazon may have.
|
FreedomIntelligence/MMLU_Indonesian | ---
license: mit
language:
- id
---
Indonesian version of the MMLU dataset, translated by gpt-3.5-turbo.
The dataset is used in research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
AIRI-NLP/quality_counter_new_3072 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 556693208
num_examples: 20000
- name: validation
num_bytes: 224391784
num_examples: 8000
- name: test
num_bytes: 56237790
num_examples: 2300
download_size: 26606876
dataset_size: 837322782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
lramriez/dominoplays | ---
license: apache-2.0
---
|
tsaiyue/Bilibili_dancing | ---
license: apache-2.0
---
|
Pablao0948/Patachoca_2 | ---
license: openrail
---
|
kaifahmad/allenai-complex-web-QnA | ---
language:
- en
license: apache-2.0
---
|
ai2lumos/lumos_complex_qa_plan_onetime | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- language-agent
- reasoning
- question-answering
- planning
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
- 🧩 **Lumos** consists of planning, grounding, and execution modules built based on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
- 🌍 **Lumos** is trained with ~56K diverse high-quality subgoal/action annotations from ground-truth reasoning steps in existing benchmarks with GPT-4.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
- 🚀 **Lumos** is comparable or even beats **GPT-series** agents on web/complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks.
  - 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FireAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training.
- 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on unseen tasks, WebShop and InterCode_SQL.
## Data Overview
`lumos_complex_qa_plan_onetime` is the data for training the **planning** module on the **complex QA** task in the **Lumos-Onetime (Lumos-O)** formulation.
The sources of the training annotation data are shown below:
| Datasets | Number |
|---|---|
|StrategyQA|1777|
|Musique|17632|
## Models Trained with the Data
`lumos_complex_qa_plan_onetime` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_complex_qa_plan_onetime`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_complex_qa_plan_onetime) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
open-llm-leaderboard/details_gordicaleksa__YugoGPT | ---
pretty_name: Evaluation run of gordicaleksa/YugoGPT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gordicaleksa/YugoGPT](https://huggingface.co/gordicaleksa/YugoGPT) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gordicaleksa__YugoGPT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T01:21:19.326910](https://huggingface.co/datasets/open-llm-leaderboard/details_gordicaleksa__YugoGPT/blob/main/results_2024-03-01T01-21-19.326910.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.603390932465886,\n\
\ \"acc_stderr\": 0.03310544042679267,\n \"acc_norm\": 0.609906941447967,\n\
\ \"acc_norm_stderr\": 0.033791126690793157,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.3659837785146236,\n\
\ \"mc2_stderr\": 0.013569001972108299\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5307167235494881,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639011\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6145190201155148,\n\
\ \"acc_stderr\": 0.004857140410776744,\n \"acc_norm\": 0.814479187412866,\n\
\ \"acc_norm_stderr\": 0.003879250555254522\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462846,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462846\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461217,\n\
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135367,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135367\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520981,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520981\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\
\ \"acc_stderr\": 0.014648172749593522,\n \"acc_norm\": 0.7867177522349936,\n\
\ \"acc_norm_stderr\": 0.014648172749593522\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n\
\ \"acc_stderr\": 0.015078358970751743,\n \"acc_norm\": 0.2837988826815642,\n\
\ \"acc_norm_stderr\": 0.015078358970751743\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914389,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914389\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.012564871542534353,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.012564871542534353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618075,\n \
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.3659837785146236,\n\
\ \"mc2_stderr\": 0.013569001972108299\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237983\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3070507960576194,\n \
\ \"acc_stderr\": 0.012705685723131696\n }\n}\n```"
repo_url: https://huggingface.co/gordicaleksa/YugoGPT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-21-19.326910.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T01-21-19.326910.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- '**/details_harness|winogrande|5_2024-03-01T01-21-19.326910.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T01-21-19.326910.parquet'
- config_name: results
data_files:
- split: 2024_03_01T01_21_19.326910
path:
- results_2024-03-01T01-21-19.326910.parquet
- split: latest
path:
- results_2024-03-01T01-21-19.326910.parquet
---
# Dataset Card for Evaluation run of gordicaleksa/YugoGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gordicaleksa/YugoGPT](https://huggingface.co/gordicaleksa/YugoGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gordicaleksa__YugoGPT",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-01T01:21:19.326910](https://huggingface.co/datasets/open-llm-leaderboard/details_gordicaleksa__YugoGPT/blob/main/results_2024-03-01T01-21-19.326910.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.603390932465886,
"acc_stderr": 0.03310544042679267,
"acc_norm": 0.609906941447967,
"acc_norm_stderr": 0.033791126690793157,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.3659837785146236,
"mc2_stderr": 0.013569001972108299
},
"harness|arc:challenge|25": {
"acc": 0.5307167235494881,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.5810580204778157,
"acc_norm_stderr": 0.014418106953639011
},
"harness|hellaswag|10": {
"acc": 0.6145190201155148,
"acc_stderr": 0.004857140410776744,
"acc_norm": 0.814479187412866,
"acc_norm_stderr": 0.003879250555254522
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462846,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461217,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135367,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135367
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520981,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520981
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593522,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593522
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751743,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751743
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914389,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914389
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534353,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.019722058939618075,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.019722058939618075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.3659837785146236,
"mc2_stderr": 0.013569001972108299
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237983
},
"harness|gsm8k|5": {
"acc": 0.3070507960576194,
"acc_stderr": 0.012705685723131696
}
}
```
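As a sanity check on how the aggregate block is assembled, note that the `mc1`/`mc2` values under `"all"` are carried over unchanged from the `harness|truthfulqa:mc|0` entry, since TruthfulQA is the only task in this run reporting those metrics. A minimal sketch, with the values copied from the JSON above:

```python
# Values copied verbatim from the "Latest results" JSON above.
results = {
    "all": {"mc1": 0.2423500611995104, "mc2": 0.3659837785146236},
    "harness|truthfulqa:mc|0": {"mc1": 0.2423500611995104,
                                "mc2": 0.3659837785146236},
}

# TruthfulQA is the only task reporting mc1/mc2, so the aggregate
# block simply mirrors the per-task values.
assert results["all"]["mc1"] == results["harness|truthfulqa:mc|0"]["mc1"]
assert results["all"]["mc2"] == results["harness|truthfulqa:mc|0"]["mc2"]
```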
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-80000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1075624
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Oscaraandersson/reveal | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 25247683.954605438
num_examples: 18187
- name: valid
num_bytes: 3155439.909210874
num_examples: 2273
- name: test
num_bytes: 3156828.13618369
num_examples: 2274
download_size: 11897933
dataset_size: 31559952.000000004
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Ali-C137/masader4AOT-mini | ---
dataset_info:
features:
- name: Name
dtype: string
- name: Subsets
list:
- name: Dialect
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: Volume
dtype: string
- name: HF Link
dtype: string
- name: Link
dtype: string
splits:
- name: train
num_bytes: 45449
num_examples: 322
download_size: 23287
dataset_size: 45449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tombryan/mbta-ridership | ---
license: mit
---
|
SebasV/autotrain-data-tableros_factibilidad | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: tableros_factibilidad
## Dataset Description
This dataset has been automatically processed by AutoTrain for project tableros_factibilidad.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<217x409 RGB PIL image>",
"target": 1
},
{
"image": "<311x574 RGB PIL image>",
"target": 2
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['sin adecuaciones', 'tablero', 'tablero 2 medidores', 'tablero varios medidores'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 13 |
| valid | 5 |
|
dev-ninja/one_piece_QA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2958
num_examples: 89
download_size: 2190
dataset_size: 2958
---
# Dataset Card for "one_piece_QA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cdminix/libritts-r-aligned | ---
pretty_name: LibriTTS Corpus with Forced Alignments
annotations_creators:
- crowdsourced
language: en
tags:
- speech
- audio
- automatic-speech-recognition
- text-to-speech
license:
- cc-by-4.0
task_categories:
- automatic-speech-recognition
- text-to-speech
extra_gated_prompt: "When using this dataset to download LibriTTS, you agree to the terms on https://www.openslr.org"
---
> This dataset is identical to **[cdminix/libritts-aligned](https://huggingface.co/datasets/cdminix/libritts-aligned)** except it uses the newly released LibriTTS-R corpus. Please cite **[Y. Koizumi, et al., "LibriTTS-R: Restoration of a Large-Scale Multi-Speaker TTS Corpus", Interspeech 2023](https://google.github.io/df-conformer/librittsr/)**
*When using this dataset to download LibriTTS-R, make sure you agree to the terms on https://www.openslr.org*
# Dataset Card for LibriTTS-R with Forced Alignments (and Measures)
This dataset downloads LibriTTS-R and preprocesses it on your machine to create alignments using [montreal forced aligner](https://montreal-forced-aligner.readthedocs.io/en/latest/).
You need to run ``pip install alignments phones`` before using this dataset.
When running this the first time, it can take an hour or two, but subsequent runs will be lightning fast.
## Requirements
- ``pip install alignments phones`` **(required)**
- ``pip install speech-collator`` (optional)
*Note: version >=0.0.15 of alignments is required for this corpus*
## Example Item
```json
{
'id': '100_122655_000073_000002.wav',
'speaker': '100',
'text': 'the day after, diana and mary quitted it for distant b.',
'start': 0.0,
'end': 3.6500000953674316,
'phones': ['[SILENCE]', 'ð', 'ʌ', '[SILENCE]', 'd', 'eɪ', '[SILENCE]', 'æ', 'f', 't', 'ɜ˞', '[COMMA]', 'd', 'aɪ', 'æ', 'n', 'ʌ', '[SILENCE]', 'æ', 'n', 'd', '[SILENCE]', 'm', 'ɛ', 'ɹ', 'i', '[SILENCE]', 'k', 'w', 'ɪ', 't', 'ɪ', 'd', '[SILENCE]', 'ɪ', 't', '[SILENCE]', 'f', 'ɜ˞', '[SILENCE]', 'd', 'ɪ', 's', 't', 'ʌ', 'n', 't', '[SILENCE]', 'b', 'i', '[FULL STOP]'],
'phone_durations': [5, 2, 4, 0, 5, 13, 0, 16, 7, 5, 20, 2, 6, 9, 15, 4, 2, 0, 11, 3, 5, 0, 3, 8, 9, 8, 0, 13, 3, 5, 3, 6, 4, 0, 8, 5, 0, 9, 5, 0, 7, 5, 6, 7, 4, 5, 10, 0, 3, 35, 9],
'audio': '/dev/shm/metts/train-clean-360-alignments/100/100_122655_000073_000002.wav'
}
```
The phones are IPA phones, and the phone durations are in frames (assuming a hop length of 256, sample rate of 22050 and window length of 1024). These attributes can be changed using the ``hop_length``, ``sample_rate`` and ``window_length`` arguments to ``LibriTTSAlign``.
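As a sanity check, frame counts can be converted back to seconds under those defaults; a small sketch (the function name is ours, not part of the dataset API):

```python
# Assumed defaults from the text above: hop_length=256, sample_rate=22050.
def frames_to_seconds(frame_counts, hop_length=256, sample_rate=22050):
    """Total duration in seconds for a list of per-phone frame counts."""
    return sum(frame_counts) * hop_length / sample_rate

# First few phone_durations from the example item above.
print(round(frames_to_seconds([5, 2, 4, 0, 5, 13]), 3))  # ~0.337 s
```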
## Data Collator
This dataset comes with a data collator which can be used to create batches of data for training.
It can be installed using ``pip install speech-collator`` ([MiniXC/speech-collator](https://www.github.com/MiniXC/speech-collator)) and can be used as follows:
```python
import json
from datasets import load_dataset
from speech_collator import SpeechCollator
from torch.utils.data import DataLoader
dataset = load_dataset('cdminix/libritts-r-aligned', split="train")
speaker2idx = json.load(open("speaker2idx.json"))
phone2idx = json.load(open("phone2idx.json"))
collator = SpeechCollator(
    speaker2idx=speaker2idx,
    phone2idx=phone2idx,
)
dataloader = DataLoader(dataset, collate_fn=collator.collate_fn, batch_size=8)
```
You can either download the ``speaker2idx.json`` and ``phone2idx.json`` files from [here](https://huggingface.co/datasets/cdminix/libritts-aligned/tree/main/data) or create them yourself using the following code:
```python
import json
from datasets import load_dataset
from speech_collator import SpeechCollator, create_speaker2idx, create_phone2idx
dataset = load_dataset("cdminix/libritts-r-aligned", split="train")
# Create speaker2idx and phone2idx
speaker2idx = create_speaker2idx(dataset, unk_idx=0)
phone2idx = create_phone2idx(dataset, unk_idx=0)
# save to json
with open("speaker2idx.json", "w") as f:
    json.dump(speaker2idx, f)
with open("phone2idx.json", "w") as f:
    json.dump(phone2idx, f)
```
### Measures
When using ``speech-collator`` you can also use the ``measures`` argument to specify which measures to use. The following example extracts Pitch and Energy on the fly.
```python
import json
from torch.utils.data import DataLoader
from datasets import load_dataset
from speech_collator import SpeechCollator, create_speaker2idx, create_phone2idx
from speech_collator.measures import PitchMeasure, EnergyMeasure
dataset = load_dataset("cdminix/libritts-r-aligned", split="train")
speaker2idx = json.load(open("data/speaker2idx.json"))
phone2idx = json.load(open("data/phone2idx.json"))
# Create SpeechCollator
speech_collator = SpeechCollator(
    speaker2idx=speaker2idx,
    phone2idx=phone2idx,
    measures=[PitchMeasure(), EnergyMeasure()],
    return_keys=["measures"],
)
# Create DataLoader
dataloader = DataLoader(
    dataset,
    batch_size=8,
    collate_fn=speech_collator.collate_fn,
)
```
COMING SOON: Detailed documentation on how to use the measures at [MiniXC/speech-collator](https://www.github.com/MiniXC/speech-collator).
## Splits
This dataset has the following splits:
- ``train``: All the training data, except one sample per speaker which is used for validation.
- ``dev``: The validation data, one sample per speaker.
- ``train.clean.100``: Training set derived from the original materials of the train-clean-100 subset of LibriSpeech.
- ``train.clean.360``: Training set derived from the original materials of the train-clean-360 subset of LibriSpeech.
- ``train.other.500``: Training set derived from the original materials of the train-other-500 subset of LibriSpeech.
- ``dev.clean``: Validation set derived from the original materials of the dev-clean subset of LibriSpeech.
- ``dev.other``: Validation set derived from the original materials of the dev-other subset of LibriSpeech.
- ``test.clean``: Test set derived from the original materials of the test-clean subset of LibriSpeech.
- ``test.other``: Test set derived from the original materials of the test-other subset of LibriSpeech.
## Environment Variables
There are a few environment variables which can be set.
- ``LIBRITTS_VERBOSE``: If set, will print out more information about the dataset creation process.
- ``LIBRITTS_MAX_WORKERS``: The number of workers to use when creating the alignments. Defaults to ``cpu_count()``.
- ``LIBRITTS_PATH``: The path to download LibriTTS to. Defaults to the value of ``HF_DATASETS_CACHE``.
# Citation
When using LibriTTS-R please cite the following papers:
- [LibriTTS-R: Restoration of a Large-Scale Multi-Speaker TTS Corpus](https://google.github.io/df-conformer/librittsr/)
- [LibriTTS: A Corpus Derived from LibriSpeech for Text-to-Speech](https://arxiv.org/abs/1904.02882)
- [Montreal Forced Aligner: Trainable text-speech alignment using Kaldi](https://www.researchgate.net/publication/319185277_Montreal_Forced_Aligner_Trainable_Text-Speech_Alignment_Using_Kaldi)
When using the Measures please cite the following paper (ours):
- [Evaluating and reducing the distance between synthetic and real speech distributions](https://arxiv.org/abs/2211.16049) |
CyberHarem/w_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of w/W/W (Arknights)
This is the dataset of w/W/W (Arknights), containing 500 images and their tags.
The core tags of this character are `horns, short_hair, grey_hair, breasts, demon_horns, medium_breasts, red_eyes, ahoge, tail, multicolored_hair, orange_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/w_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 469.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/w_arknights/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1339 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/w_arknights/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 865.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/w_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1339 | 1.68 GiB | [Download](https://huggingface.co/datasets/CyberHarem/w_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/w_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_gloves, black_jacket, black_pantyhose, black_scarf, black_skirt, fingerless_gloves, grey_shirt, looking_at_viewer, miniskirt, open_jacket, solo, white_hair, long_sleeves, cowboy_shot, holding, nail_polish, open_mouth, red_nails, smile, grey_background, remote_control, white_shirt |
| 1 | 6 |  |  |  |  |  | 1girl, black_gloves, black_jacket, black_scarf, black_skirt, fingerless_gloves, grey_shirt, looking_at_viewer, open_jacket, red_nails, solo, :d, black_pantyhose, cowboy_shot, large_breasts, miniskirt, nail_polish, open_mouth, holding_remote_control |
| 2 | 6 |  |  |  |  |  | 1girl, black_footwear, black_gloves, black_jacket, black_pantyhose, black_scarf, black_skirt, fingerless_gloves, grey_shirt, looking_at_viewer, miniskirt, open_jacket, red_nails, smile, solo, antennae, demon_tail, nail_polish, shoes, full_body, white_hair |
| 3 | 6 |  |  |  |  |  | 1girl, black_gloves, black_jacket, black_scarf, fingerless_gloves, grey_shirt, looking_at_viewer, nail_polish, open_jacket, parted_lips, red_nails, simple_background, solo, upper_body, antenna_hair, grin, white_background, hair_between_eyes, scar_on_arm, white_hair |
| 4 | 6 |  |  |  |  |  | 1girl, black_gloves, black_jacket, black_scarf, black_skirt, cowboy_shot, fingerless_gloves, grey_shirt, long_sleeves, looking_at_viewer, miniskirt, open_jacket, smile, solo, yellow_eyes, parted_lips, black_pantyhose, large_breasts |
| 5 | 5 |  |  |  |  |  | 1girl, black_gloves, black_jacket, black_scarf, fingerless_gloves, grey_shirt, looking_at_viewer, nail_polish, open_jacket, red_nails, simple_background, solo, upper_body, hand_up, white_background, :d, open_mouth, grey_background, heart, long_sleeves |
| 6 | 7 |  |  |  |  |  | 1girl, black_jacket, black_scarf, looking_at_viewer, open_jacket, solo, upper_body, grey_shirt, parted_lips, simple_background, smile, white_background, white_hair, white_shirt, antennae, yellow_eyes |
| 7 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_jacket, black_pantyhose, black_skirt, large_breasts, looking_at_viewer, miniskirt, off_shoulder, open_jacket, sleeveless_shirt, solo, grey_shirt, black_scarf, simple_background, sitting, white_background, blush, cowboy_shot, smile, white_hair |
| 8 | 6 |  |  |  |  |  | 1girl, antennae, black_jacket, long_sleeves, looking_at_viewer, nail_polish, official_alternate_costume, open_jacket, red_nails, solo, bare_shoulders, black_gloves, fingerless_gloves, off_shoulder, white_shirt, white_tank_top, black_choker, large_breasts, smile, antenna_hair, black_shorts, black_thighhighs, demon_girl, demon_tail, hand_up, parted_lips, simple_background, white_background |
| 9 | 12 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, smile, solo, bare_shoulders, alternate_costume, demon_tail, drinking_glass, indoors, large_breasts, antenna_hair, bare_arms, demon_girl, sleeveless_dress, feet_out_of_frame, nail_polish, necklace, red_nails, black_pantyhose, holding_cup, parted_lips, red_hair, sideboob, sitting |
| 10 | 10 |  |  |  |  |  | 1girl, bare_shoulders, cleavage_cutout, looking_at_viewer, sleeveless_dress, solo, alternate_costume, antennae, demon_tail, double_bun, navel_cutout, red_dress, red_nails, simple_background, black_dress, china_dress, thighs, white_background, earrings, pelvic_curtain, red_hair, sitting, thigh_strap, antenna_hair, demon_girl, nail_polish |
| 11 | 10 |  |  |  |  |  | 1girl, black_headwear, demon_tail, solo, black_gloves, cleavage, demon_girl, hat, looking_at_viewer, official_alternate_costume, smile, single_elbow_glove, torn_pantyhose, asymmetrical_gloves, black_pantyhose, cowboy_shot, holding_mask, thighhighs, yellow_eyes, red_hair, thigh_boots, black_dress, black_footwear, open_mouth, teeth |
| 12 | 10 |  |  |  |  |  | 1girl, alternate_costume, alternate_hair_length, long_hair, solo, ballerina, bare_shoulders, looking_at_viewer, antenna_hair, tutu, white_dress, leotard, strapless_dress, ballet, cleavage, collarbone, demon_tail, white_pantyhose, yellow_eyes, antennae, armpits, black_choker, demon_girl, large_breasts, leg_up, parted_lips, red_hair, simple_background, standing_on_one_leg, thighband_pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_jacket | black_pantyhose | black_scarf | black_skirt | fingerless_gloves | grey_shirt | looking_at_viewer | miniskirt | open_jacket | solo | white_hair | long_sleeves | cowboy_shot | holding | nail_polish | open_mouth | red_nails | smile | grey_background | remote_control | white_shirt | :d | large_breasts | holding_remote_control | black_footwear | antennae | demon_tail | shoes | full_body | parted_lips | simple_background | upper_body | antenna_hair | grin | white_background | hair_between_eyes | scar_on_arm | yellow_eyes | hand_up | heart | bare_shoulders | off_shoulder | sleeveless_shirt | sitting | blush | official_alternate_costume | white_tank_top | black_choker | black_shorts | black_thighhighs | demon_girl | black_dress | alternate_costume | drinking_glass | indoors | bare_arms | sleeveless_dress | feet_out_of_frame | necklace | holding_cup | red_hair | sideboob | cleavage_cutout | double_bun | navel_cutout | red_dress | china_dress | thighs | earrings | pelvic_curtain | thigh_strap | black_headwear | cleavage | hat | single_elbow_glove | torn_pantyhose | asymmetrical_gloves | holding_mask | thighhighs | thigh_boots | teeth | alternate_hair_length | long_hair | ballerina | tutu | white_dress | leotard | strapless_dress | ballet | collarbone | white_pantyhose | armpits | leg_up | standing_on_one_leg | thighband_pantyhose |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:---------------|:------------------|:--------------|:--------------|:--------------------|:-------------|:--------------------|:------------|:--------------|:-------|:-------------|:---------------|:--------------|:----------|:--------------|:-------------|:------------|:--------|:------------------|:-----------------|:--------------|:-----|:----------------|:-------------------------|:-----------------|:-----------|:-------------|:--------|:------------|:--------------|:--------------------|:-------------|:---------------|:-------|:-------------------|:--------------------|:--------------|:--------------|:----------|:--------|:-----------------|:---------------|:-------------------|:----------|:--------|:-----------------------------|:-----------------|:---------------|:---------------|:-------------------|:-------------|:--------------|:--------------------|:-----------------|:----------|:------------|:-------------------|:--------------------|:-----------|:--------------|:-----------|:-----------|:------------------|:-------------|:---------------|:------------|:--------------|:---------|:-----------|:-----------------|:--------------|:-----------------|:-----------|:------|:---------------------|:-----------------|:----------------------|:---------------|:-------------|:--------------|:--------|:------------------------|:------------|:------------|:-------|:--------------|:----------|:------------------|:---------|:-------------|:------------------|:----------|:---------|:----------------------|:----------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | X | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | | X | X | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | | X | | X | X | X | | X | X | X | | | | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | | | | | X | | | | | X | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | | X | X | X | | X | X | | X | | | X | X | X | | X | | | X | | | | | | | | | X | X | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | X | | X | | | X | X | | X | X | X | | | | | | | X | | | X | | | | | X | | | | X | X | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | X | X | X | X | | X | X | X | X | X | X | | X | | | | | X | | | | | X | | | | | | | | X | | | | X | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | X | | | | X | | X | | X | X | | X | | | X | | X | X | | | X | | X | | | X | X | | | X | X | | X | | X | | | | X | | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 12 |  |  |  |  |  | X | | | X | | | | | X | | | X | | | | | X | | X | X | | | | | X | | | | X | | | X | | | X | | | | | | | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 10 |  |  |  |  |  | X | | | | | | | | X | | | X | | | | | X | | X | | | | | | | | | X | X | | | | X | | X | | X | | | | | | X | | | X | | | | | | | X | X | X | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 10 |  |  |  |  |  | X | X | | X | | | | | X | | | X | | | X | | | X | | X | | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | X | | | | | X | X | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 12 | 10 |  |  |  |  |  | X | | | | | | | | X | | | X | | | | | | | | | | | | | X | | | X | X | | | X | X | | X | | | | | X | | | X | | | | | | | X | | | X | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
huggingartists/obladaet | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/obladaet"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.43973 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/4411ffc50a3cd07d303d09a5db3b7cf5.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/obladaet">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">OBLADAET</div>
<a href="https://genius.com/artists/obladaet">
<div style="text-align: center; font-size: 14px;">@obladaet</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/obladaet).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/obladaet")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|121| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/obladaet")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
bupt/LawDataset-BUPT | ---
language:
- zh
tags:
- legal
pretty_name: LawDataset-BUPT
size_categories:
- 1M<n<10M
---
## LawDataset-BUPT ⚖️
Here is the full data from the Legal LLM project, from which we hope to build a high-quality dataset.
Here's our [github project page](https://github.com/KLGR123/LegalLLM-BUPT).
If you want to make any contribution, please contact me via QQ: 2248157602.
### Data Source
Our data mainly comes from:
- CrimeKgAssistant, 856 crime KG items / 2800k crime name_entities / 200k lawQA with 13 classes
- Tigerbot-law-plugin 55k laws provision data with 11 classes
- Wenshu_ms_dataset 45k law judgements data
- Lexilaw
- LawGPT-zh 52k QA data
- Lawyer_LLAMA law exam and instruction data
- hualv_webste_QA 20k law QA data
- baidu_zhidao_law_QA 36k law QA data
- BELLE general dataset 1.5M
For BELLE dataset and models, please download directly at [BELLE huggingface page](https://huggingface.co/datasets/BELLE-2/train_3.5M_CN_With_Category).
### Data Statistics
So far, the dataset sizes are approximately:
- Law QA data size: ~310k
- Law provision data size: ~55k
- Law judgement data size: ~45k
- General data size: ~1500k
### Data Fields
You can check the data fields for each data source below.
Wenshu_ms_dataset 45k law judgements data
```
{
"Case": "王某甲与辽宁古田房地产有限公司房屋拆迁安置补偿合同纠纷一审民事判决书",
"CaseId": "7abb676880254ca79c34a90e0101bc8e",
"CaseProc": "民事一审",
"CaseRecord": "原告王某甲与被告辽宁古田房地产有限公司房屋拆迁安置补偿合同纠纷一案,本院于2018年4月26日受理后,依法由审判员雷凯独任审判,公开开庭进行了审理。原告王某甲与被告辽宁古田房地产有限公司的委托代理人李某、刘某某到庭参加诉讼。本案现已审理终结",
"CaseType": "民事案件",
"JudgeAccusation": "原告王某甲诉称:原告原住大东区XX,2009年动迁至2014年回迁,至今被告没给原告房屋补助款。原告多次向被告主张房屋补助款,被告总是说没钱等等再等等。后来被告用这笔款给原告折抵五年物业费(从2015.1.1至2019.12.31),剩余房屋补助费3万多,到现在一直没解决,故起诉至法院。请求法院判令1、被告给付原告房屋拆迁款48000元;2、起诉费由被告承担。\n被告辽宁古田房地产有限公司辩称:针对原告诉讼请求48000元,被告对此不予认可,原、被告双方于2016年9月21日签订了协议书一份,对双方拆迁安置补助费的具体数额进行了重新确认,顶5年物业费后,尚欠安置费33828元。现原告诉讼请求48000元无法律依据,应按双方签订的协议书继续履行,该协议书系双方真实意思表示,具有法律效力。\n经审理查明:2008年7月25日,原被告签订城市房屋拆迁补偿安置协议。2016年9月21日,原告与被告签订协议书,该协议约定逾期安置补助费为48000元,原被告双方同意按百分之八十即38400元进行全部抵顶。其中4572元抵顶原告房屋五年的物业费(从2015年1月1日至2019年12月31日期间),剩余33828元待被告资金充足时解决。原告在庭审中自述从2015年至今没有缴纳过物业费。\n上述事实,有城市房屋拆迁补偿安置协议、协议书等证据及原被告陈述,经开庭质证,本院予以确认,在卷佐证",
"JudgeReason": "本院认为:2016年9月21日,原告与被告签订协议书系双方真实的意思表示,内容不违反法律规定,合法有效,双方均应遵守。在该协议中,原被告协商一致在抵顶五年的物业费后,被告尚欠原告逾期安置补助费33828元,被告至今没有给付原告,故被告应当给付原告逾期安置补助费33828元。\n综上所述,根据《中华人民共和国合同法》第四十四条之规定,判决如下",
"JudgeResult": "一、被告辽宁古田房地产有限公司于本判决生效后十日内给付原告王某甲逾期安置补助费33828元;\n二、驳回原告王某甲的其他诉讼请求。\n如被告未按本判决所指定的期限履行给付义务,则应当依照《中华人民共和国民事诉讼法》第二百五十三条之规定,加倍支付迟延履行期间的债务利息。\n案件受理费1000元,减半收取500元,由原告王某甲负担177元,由被告辽宁古田房地产有限公司负担323元。\n如不服本判决,可在判决书送达之日起15日内向本院递交上诉状,并按对方当事人的人数提出副本,交纳上诉案件受理费,上诉于辽宁省沈阳市中级人民法院。如上诉期满后7日内未交纳上诉案件受理费,按自动撤回上诉处理",
"Keywords": [
"给付"
],
"Parties": [
{
"NameText": "王某甲",
"Name": "王某甲",
"LegalEntity": "Person",
"Prop": "原告"
},
{
"NameText": "辽宁古田房地产有限公司",
"Name": "辽宁古田房地产有限公司",
"LegalEntity": "Corporation",
"Prop": "被告"
}
],
"Category": {
"cat_1": "房地产纠纷",
"cat_2": "房产纠纷"
}
}
```
Tigerbot-law-plugin 55k laws provision data with 11 classes
```
{"type": "宪法", "title": "中华人民共和国宪法", "chapter1": "第一章 总纲", "content": "第六条 中华人民共和国的社会主义经济制度的基础是生产资料的社会主义公有制,即全民所有制和劳动群众集体所有制。社会主义公有制消灭人剥削人的制度,实行各尽所能、按劳分配的原则。\n国家在社会主义初级阶段,坚持公有制为主体、多种所有制经济共同发展的基本经济制度,坚持按劳分配为主体、多种分配方式并存的分配制度。", "chapter2": "", "chapter3": ""}
```
baidu_zhidao_law_QA 36k law QA data
```
title,question,reply,is_best
在法律中定金与订金的区别订金和定金哪个受,,“定金”是指当事人约定由一方向对方给付的,作为债权担保的一定数额的货币,它属于一种法律上的担保方式,目的在于促使债务人履行债务,保障债权人的债权得以实现。签合同时,对定金必需以书面形式进行约定,同时还应约定定金的数额和交付期限。给付定金一方如果不履行债务,无权要求另一方返还定金;接受定金的一方如果不履行债务,需向另一方双倍返还债务。债务人履行债务后,依照约定,定金应抵作价款或者收回。而“订金”目前我国法律没有明确规定,它不具备定金所具有的担保性质,可视为“预付款”,当合同不能履行时,除不可抗力外,应根据双方当事人的过错承担违约责任。,1
```
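Since these rows are plain CSV (columns `title,question,reply,is_best`), the standard library is enough to parse them; a minimal sketch assuming UTF-8 encoding (the sample row below is ours, not taken from the dataset):

```python
import csv
import io

# A tiny stand-in row in the same title,question,reply,is_best layout.
sample = 'title,question,reply,is_best\nsome title,a question,a reply,1\n'
rows = list(csv.DictReader(io.StringIO(sample)))
# csv yields strings, so is_best needs an explicit cast for filtering.
best_only = [r for r in rows if int(r["is_best"]) == 1]
print(len(best_only))
```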
CrimeKgAssistant
```
{
"completion": "根据相关法律规定,未满14岁的人不得驾驶机动车辆,骑行电动车也属于机动车范畴。因此,未成年捏了电动车的刹车,可以认定为违法行为。同时,成年骑电动车的人也应当承担相应的民事责任。",
"prompt": "一14岁未成年骑自行车与一成年骑电动车相撞,T形路,未成年拐弯,未成年捏了刹车的电动车属于机动车吗??"
}
```
JEC-QA
```
{"answer": ["D"], "id": "3_2613", "option_list": {"A": "因未办理收养登记,包某与陈煜之间不存在法律上父子关系", "B": "陈煜作为包某生前抚养且无经济来源的人,可适当分得包某遗产", "C": "陈某的遗产由洪某与陈婴继承,陈煜不能继承", "D": "陈煜既可以继承陈某的遗产,也可以继承包某的遗产"}, "statement": "陈某与潘某离婚后,潘某带着2岁的儿子陈煜改嫁包某。陈某、潘某、包某三人订立收养协议,陈煜由包某收养,今后一切与陈某概无关系,但未办理收养登记。5年后,潘某与包某生下一女,取名包红。陈某离婚后,与洪某结婚,生女取名陈婴。几年后,陈某、包某相继去世。下列说法中正确的是:", "type": "1"}
```
|
Sujithanumala/AI_Interviewer | ---
dataset_info:
features:
- name: input_ids
dtype: string
- name: labels
dtype: string
splits:
- name: Train
num_bytes: 23599038
num_examples: 23081
download_size: 5044017
dataset_size: 23599038
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ppierzc/ios-app-icons | ---
license: openrail
id: ios-app-icons
title: IOS App Icons
description: 'This dataset contains images and captions of iOS app icons collected
from the iOS Icon Gallery. The images have been processed using a Blip Image Captioning
model to generate captions.
'
tasks:
- image-captioning
tags:
- image-captioning
- ios-icons
created: December 20, 2023
citation: Author, A. et al. (2023). Your Dataset Name. [Hugging Face Datasets](https://huggingface.co/datasets/your_dataset_name).
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 367958490.476
num_examples: 1819
- name: test
num_bytes: 24842350.0
num_examples: 100
download_size: 338140473
dataset_size: 392800840.476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# IOS App Icons
## Overview
This dataset contains images and captions of iOS app icons obtained from the iOS Icon Gallery. Each image is paired with a caption generated by a BLIP image-captioning model. The dataset is suitable for image-captioning tasks and can be used to train and evaluate models that generate captions for iOS app icons.
## Images
The images are stored in the 'images' directory, and each image is uniquely identified with a filename (e.g., 'image_0.png'). The images have a resolution of 512x512 pixels.
## Data Format
The dataset is provided in the Hugging Face datasets format, with each sample containing the following information:
- `image_path`: Local file path to the image.
- `caption`: Generated caption for the corresponding image.
## Usage
You can use this dataset for training, fine-tuning, and evaluating image captioning models. The captions can be leveraged for tasks such as generating natural language descriptions for iOS app icons.
## Acknowledgments
- iOS Icon Gallery: [https://www.iosicongallery.com](https://www.iosicongallery.com)
- Blip Image Captioning model: [Salesforce/blip-image-captioning-large](https://huggingface.co/Salesforce/blip-image-captioning-large)
## License
This dataset is released under the [Apache-2.0 License](https://www.apache.org/licenses/LICENSE-2.0). Please review the license for details. |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/8e0339b4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1335
dataset_size: 182
---
# Dataset Card for "8e0339b4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/satanichia_kurumizawa_mcdowell_gabrieldropout | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Satanichia Kurumizawa McDowell
This is the dataset of Satanichia Kurumizawa McDowell, containing 309 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 309 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 695 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 816 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 309 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 309 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 309 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 695 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 695 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 589 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 816 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 816 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Nzham/PRED | ---
license: unknown
---
|
HuggingFaceM4/VisualMRC-nobbox | |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c50da3-1597456332 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-2.7b
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-2.7b
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
vira-chatbot/vira-intents-mod-gpt4-data-aug | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1091238
num_examples: 14251
- name: validation
num_bytes: 213834
num_examples: 2971
download_size: 0
dataset_size: 1305072
---
# Dataset Card for "vira-intents-mod-gpt4-data-aug"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/cai-conversation-prod-h4-harmless | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
dataset_info:
features:
- name: index
dtype: int64
- name: prompt
dtype: string
- name: init_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: init_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: critic_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: revision_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 80509011.39550994
num_examples: 21289
- name: test_sft
num_bytes: 4409523.340505145
num_examples: 1156
- name: train_prefs
num_bytes: 80509011.39550994
num_examples: 21289
- name: test_prefs
num_bytes: 4413337.807062675
num_examples: 1157
download_size: 52332286
dataset_size: 169840883.93858773
---
# Dataset Card for "cai-conversation-prod-h4-harmless"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
akkasi/go_emotions | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence: float64
- name: label2idx
dtype: string
- name: idx2label
dtype: string
splits:
- name: train
num_bytes: 210169067
num_examples: 168980
- name: test
num_bytes: 52552436
num_examples: 42245
download_size: 13348134
dataset_size: 262721503
---
# Dataset Card for "go_emotions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_32 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 13993521264.375
num_examples: 145693
download_size: 12371757603
dataset_size: 13993521264.375
---
# Dataset Card for "chunk_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Fiaa/ArtGenre | ---
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
dataset_info:
features:
- name: idx
dtype: int32
- name: image1
dtype: image
- name: image2
dtype: image
- name: image3
dtype: image
- name: image4
dtype: image
- name: choices
sequence: string
- name: image1_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: image2_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: image3_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: image4_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: answer
dtype: string
splits:
- name: val
num_bytes: 240278209.0
num_examples: 150
- name: test
num_bytes: 245751090.0
num_examples: 150
download_size: 480446228
dataset_size: 486029299.0
---
## Source
dataset created from https://huggingface.co/datasets/huggan/wikiart
## Task
Find two images that have the same genre
## Prompt:
```
The most common method for classifying art paintings is by genre (or theme). The term “genre” refers to the type of image that serves as the subject of a painting. The genre of a painting is classified as a cityscape, landscape, nude painting, portrait, religious painting, sketch and study, or still life.
Given the four images of art paintings, can you tell which two of them have the same genre?
Select between the following choices.
(A) ...
(B) ...
(C) ...
(D) ...
```
---
license: apache-2.0
dataset_info:
features:
- name: idx
dtype: int32
- name: image1
dtype: image
- name: image2
dtype: image
- name: image3
dtype: image
- name: image4
dtype: image
- name: choices
sequence: string
- name: image1_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: image2_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: image3_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: image4_genre
dtype:
class_label:
names:
'0': abstract_painting
'1': cityscape
'2': genre_painting
'3': illustration
'4': landscape
'5': nude_painting
'6': portrait
'7': religious_painting
'8': sketch_and_study
'9': still_life
- name: answer
dtype: string
splits:
- name: test
num_bytes: 486029299.0
num_examples: 300
download_size: 480861315
dataset_size: 486029299.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
kitkatdafu/jester_embedding | ---
license: apache-2.0
---
|
HumanCentricART/P-AT | ---
license: gpl-3.0
task_categories:
- text-classification
language:
- en
tags:
- stereotype-detection
- bias-evaluation
---
# Measuring bias in Instruction-Following models with P-AT
Instruction-Following Language Models (IFLMs) are promising and versatile tools for solving many downstream, information-seeking tasks. Given their success, there is an urgent need to have a shared resource to determine whether existing and new IFLMs are prone to produce biased language interactions.
We propose **Prompt Association Test (P-AT)**, a resource for testing the presence of social biases in IFLMs.
P-AT stems from WEAT (Caliskan et al., 2017) and generalizes the notion of measuring social biases to IFLMs. The resource consists of 2310 questions and aims to help detect biases in IFLMs across multiple dimensions.
# Citation Info
More info on this dataset are available in our paper, [Measuring bias in Instruction-Following models with P-AT](https://aclanthology.org/2023.findings-emnlp.539/) in EMNLP 2023 Findings.
```
@inproceedings{
onorati-etal-2023-measuring,
title = "Measuring bias in Instruction-Following models with {P}-{AT}",
author = "Onorati, Dario and
Ruzzetti, Elena Sofia and
Venditti, Davide and
Ranaldi, Leonardo and
Zanzotto, Fabio Massimo",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2023",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.findings-emnlp.539",
doi = "10.18653/v1/2023.findings-emnlp.539",
pages = "8006--8034",
abstract = "Instruction-Following Language Models (IFLMs) are promising and versatile tools for solving many downstream, information-seeking tasks. Given their success, there is an urgent need to have a shared resource to determine whether existing and new IFLMs are prone to produce biased language interactions. In this paper, we propose Prompt Association Test (P-AT): a new resource for testing the presence of social biases in IFLMs. P-AT stems from WEAT (Caliskan et al., 2017) and generalizes the notion of measuring social biases to IFLMs. Basically, we cast WEAT word tests in promptized classification tasks, and we associate a metric - the bias score. Our resource consists of 2310 prompts. We then experimented with several families of IFLMs discovering gender and race biases in all the analyzed models. We expect P-AT to be an important tool for quantifying bias across different dimensions and, therefore, for encouraging the creation of fairer IFLMs before their distortions have consequences in the real world.",
}
``` |
py_ast | ---
pretty_name: PyAst
annotations_creators:
- machine-generated
language_creators:
- found
language:
- code
license:
- bsd-2-clause
- mit
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text2text-generation
- text-generation
- fill-mask
task_ids: []
paperswithcode_id: null
tags:
- code-modeling
- code-generation
dataset_info:
features:
- name: ast
sequence:
- name: type
dtype: string
- name: value
dtype: string
- name: children
sequence: int32
config_name: ast
splits:
- name: train
num_bytes: 1870790180
num_examples: 100000
- name: test
num_bytes: 907514993
num_examples: 50000
download_size: 526642289
dataset_size: 2778305173
---
# Dataset Card for [py_ast]
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **homepage**: [py150](https://www.sri.inf.ethz.ch/py150)
- **Paper**: [Probabilistic Model for Code with Decision Trees](https://www.semanticscholar.org/paper/Probabilistic-model-for-code-with-decision-trees-Raychev-Bielik/62e176977d439aac2e2d7eca834a7a99016dfcaf)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset consists of parsed ASTs that were used to train and evaluate the DeepSyn tool.
The Python programs were collected from GitHub repositories by removing duplicate files, removing project forks (copies of existing repositories), keeping only programs that parse and have at most 30,000 nodes in the AST, and removing obfuscated files.
### Supported Tasks and Leaderboards
Code Representation, Unsupervised Learning
### Languages
Python
## Dataset Structure
### Data Instances
A typical datapoint contains the parsed AST of a Python program, stored under the key `ast`.
Each node has:
- `type`: the type of the node.
- `children`: the indices of the node's children, if it has any (a non-empty list).
- `value`: the node's hardcoded value, if it has one (else "N/A").
An example:
```
[ {"type":"Module","children":[1,4]},{"type":"Assign","children":[2,3]},{"type":"NameStore","value":"x"},{"type":"Num","value":"7"}, {"type":"Print","children":[5]}, {"type":"BinOpAdd","children":[6,7]}, {"type":"NameLoad","value":"x"}, {"type":"Num","value":"1"} ]
```
### Data Fields
- `ast`: a list of dictionaries, where each dictionary is one node of the abstract syntax tree.
- `type`: the type of the node.
- `children`: the indices of the nodes that are children of the given node.
- `value`: the node's hardcoded value, if it holds one.
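As a minimal sketch (not part of the original card), the flat node/children encoding above can be traversed with plain Python; `walk` below is an illustrative helper, not part of the dataset itself:

```python
# Walk the flat AST encoding used by py_ast: each node is a dict with
# "type", an optional "value", and optional "children" (indices into the list).
ast = [
    {"type": "Module", "children": [1, 4]},
    {"type": "Assign", "children": [2, 3]},
    {"type": "NameStore", "value": "x"},
    {"type": "Num", "value": "7"},
    {"type": "Print", "children": [5]},
    {"type": "BinOpAdd", "children": [6, 7]},
    {"type": "NameLoad", "value": "x"},
    {"type": "Num", "value": "1"},
]

def walk(nodes, idx=0, depth=0):
    """Yield (depth, type, value) tuples in pre-order."""
    node = nodes[idx]
    yield depth, node["type"], node.get("value", "N/A")
    for child in node.get("children", []):
        yield from walk(nodes, child, depth + 1)

for depth, ntype, value in walk(ast):
    print("  " * depth + f"{ntype}: {value}")
```

This prints the program `x = 7; print(x + 1)` as an indented tree, which can be a quick sanity check when loading the dataset.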
### Data Splits
The data is split into a training and test set.
The final split sizes are as follows:
|                  |  train |   test |
|------------------|-------:|-------:|
| py_ast examples  | 100000 |  50000 |
## Dataset Creation
[More Information Needed]
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Raychev, V., Bielik, P., and Vechev, M
### Licensing Information
MIT, BSD and Apache
### Citation Information
```
@inproceedings{10.1145/2983990.2984041,
author = {Raychev, Veselin and Bielik, Pavol and Vechev, Martin},
title = {Probabilistic Model for Code with Decision Trees},
year = {2016},
isbn = {9781450344449},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/2983990.2984041},
doi = {10.1145/2983990.2984041},
booktitle = {Proceedings of the 2016 ACM SIGPLAN International Conference on Object-Oriented Programming, Systems, Languages, and Applications},
pages = {731–747},
numpages = {17},
keywords = {Code Completion, Decision Trees, Probabilistic Models of Code},
location = {Amsterdam, Netherlands},
series = {OOPSLA 2016}
}
```
### Contributions
Thanks to [@reshinthadithyan](https://github.com/reshinthadithyan) for adding this dataset. |
vwxyzjn/openhermes-dev__mistralai_Mistral-7B-Instruct-v0.1__1707504194 | ---
dataset_info:
features:
- name: model
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: source
dtype: string
- name: title
dtype: 'null'
- name: topic
dtype: 'null'
- name: skip_prompt_formatting
dtype: bool
- name: idx
dtype: 'null'
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: language
dtype: 'null'
- name: category
dtype: string
- name: id
dtype: 'null'
- name: model_name
dtype: 'null'
- name: prompt
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_token_length
dtype: int64
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1_policy
dtype: string
- name: candidate1_token_length
dtype: int64
splits:
- name: train
num_bytes: 184782.0
num_examples: 32
download_size: 172780
dataset_size: 184782.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rthatha/GLDv2-Resized-Caption-Monza | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2053383.0
num_examples: 203
download_size: 2037879
dataset_size: 2053383.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iraklisp/myNectar_max15k_3 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 9862024
num_examples: 3300
download_size: 5412491
dataset_size: 9862024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NickyNicky/medical_bilingual_en_es | ---
dataset_info:
features:
- name: description
dtype: string
- name: medical_specialty
dtype: string
- name: sample_name
dtype: string
- name: transcription
dtype: string
splits:
- name: en
num_bytes: 12845119
num_examples: 4069
- name: es
num_bytes: 13894364
num_examples: 4069
download_size: 12814673
dataset_size: 26739483
configs:
- config_name: default
data_files:
- split: en
path: data/en-*
- split: es
path: data/es-*
---
```
https://huggingface.co/somosnlp/Sam_Diagnostic
```
|
CesarLeblanc/geoplantbert_text_classification_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 269191471
num_examples: 273597
- name: test
num_bytes: 30235535
num_examples: 30732
download_size: 74164987
dataset_size: 299427006
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_rte_it_is_referential | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 10806
num_examples: 22
- name: train
num_bytes: 5089
num_examples: 12
download_size: 22066
dataset_size: 15895
---
# Dataset Card for "MULTI_VALUE_rte_it_is_referential"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
renebustamante0204/ThaiSchool | ---
license: mit
---
|
atrost/geobenchmark_npee | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: noun
num_bytes: 98170
num_examples: 454
- name: choice
num_bytes: 29150
num_examples: 182
- name: completion
num_bytes: 23070
num_examples: 150
- name: tf
num_bytes: 12902
num_examples: 134
- name: qa
num_bytes: 133231
num_examples: 153
- name: discussion
num_bytes: 2620123
num_examples: 335
download_size: 1408820
dataset_size: 2916646
---
# Dataset Card for "geobenchmark_npee"
Dataset from https://github.com/davendw49/k2/blob/main/data/geobenchmark/geobenchmark_npee.json
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/isokaze_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of isokaze/磯風/矶风 (Azur Lane)
This is the dataset of isokaze/磯風/矶风 (Azur Lane), containing 39 images and their tags.
The core tags of this character are `animal_ears, green_hair, animal_ear_fluff, hair_ornament, long_hair, green_eyes, fang, thick_eyebrows, bangs, tail, hair_between_eyes, hairband, black_hairband, very_long_hair, fox_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 46.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 27.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 88 | 58.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 41.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 88 | 81.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/isokaze_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/isokaze_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, :d, fur_trim, long_sleeves, looking_at_viewer, navel, open_clothes, open_mouth, solo, white_thighhighs, wide_sleeves, blush, claw_pose, hair_bell, jingle_bell, full_body, groin, hands_up, platform_footwear, short_eyebrows, standing, white_skirt, zouri, ass_visible_through_thighs, flat_chest, fox_tail, magatama_necklace, midriff, pleated_skirt, red_footwear, revealing_clothes, shide, sparkle, white_background |
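The baidu_zhidao_law_QA file shown earlier is plain CSV with a `title,question,reply,is_best` header. As a sketch, it can be read with Python's standard `csv` module; the row below is a synthetic English stand-in for the Chinese data:

```python
import csv
import io

# Synthetic stand-in for the baidu_zhidao_law_QA format
# (title,question,reply,is_best); the real rows are in Chinese.
raw = (
    "title,question,reply,is_best\n"
    '"What is a deposit?","","A deposit is a statutory guarantee given by one party to the other.",1\n'
)

for row in csv.DictReader(io.StringIO(raw)):
    # is_best is stored as a string flag; "1" marks the accepted answer
    print(row["title"], row["is_best"] == "1")
```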
| 1 | 11 |  |  |  |  |  | 1girl, hair_bell, jingle_bell, solo, wide_sleeves, blush, looking_at_viewer, open_mouth, black_thighhighs, long_sleeves, :d, white_dress, white_background, standing, cat_ear_legwear, folding_fan, hair_ribbon, holding_fan, bandages, black_capelet, cat_ears, full_body, paw_print, simple_background, tabi, tassel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | :d | fur_trim | long_sleeves | looking_at_viewer | navel | open_clothes | open_mouth | solo | white_thighhighs | wide_sleeves | blush | claw_pose | hair_bell | jingle_bell | full_body | groin | hands_up | platform_footwear | short_eyebrows | standing | white_skirt | zouri | ass_visible_through_thighs | flat_chest | fox_tail | magatama_necklace | midriff | pleated_skirt | red_footwear | revealing_clothes | shide | sparkle | white_background | black_thighhighs | white_dress | cat_ear_legwear | folding_fan | hair_ribbon | holding_fan | bandages | black_capelet | cat_ears | paw_print | simple_background | tabi | tassel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----|:-----------|:---------------|:--------------------|:--------|:---------------|:-------------|:-------|:-------------------|:---------------|:--------|:------------|:------------|:--------------|:------------|:--------|:-----------|:--------------------|:-----------------|:-----------|:--------------|:--------|:-----------------------------|:-------------|:-----------|:--------------------|:----------|:----------------|:---------------|:--------------------|:--------|:----------|:-------------------|:-------------------|:--------------|:------------------|:--------------|:--------------|:--------------|:-----------|:----------------|:-----------|:------------|:--------------------|:-------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | | | X | X | | X | X | | X | X | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Seanxh/twitter_dataset_1713105942 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 28061
num_examples: 70
download_size: 16627
dataset_size: 28061
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ashish-shukla/test_dataset | ---
license: openrail
task_categories:
- text-generation
language:
- en
pretty_name: tiny_demo
size_categories:
- n<1K
--- |
Sampath1987/NER_cyber_3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2264589
num_examples: 2481
download_size: 414108
dataset_size: 2264589
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alinet/pmc_1mil | ---
license: mit
---
|
jzzcutler/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 25458344
num_examples: 4000
download_size: 7291887
dataset_size: 25458344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aimingter/vae | ---
license: openrail
---
|
thiomajid/java_renaming_patch | ---
dataset_info:
features:
- name: commit_sha
dtype: string
- name: modifications
list:
- name: name
dtype: string
- name: new_args
dtype: string
- name: new_implementation
dtype: string
- name: new_return_type
dtype: string
- name: new_signature
dtype: string
- name: old_args
dtype: string
- name: old_implementation
dtype: string
- name: old_return_type
dtype: string
- name: old_signature
dtype: string
- name: files
list:
- name: additions
dtype: int64
- name: blob_url
dtype: string
- name: changes
dtype: int64
- name: contents_url
dtype: string
- name: deletions
dtype: int64
- name: filename
dtype: string
- name: new_code
dtype: string
- name: new_methods
list:
- name: arguments
sequence: string
- name: filename
dtype: string
- name: implementation
dtype: string
- name: signature
dtype: string
- name: old_code
dtype: string
- name: old_methods
list:
- name: arguments
sequence: string
- name: filename
dtype: string
- name: implementation
dtype: string
- name: signature
dtype: string
- name: patch
dtype: string
- name: raw_url
dtype: string
- name: sha
dtype: string
- name: status
dtype: string
- name: file_patch_count
dtype: int64
- name: enhanced_diffs
list:
- name: name
dtype: string
- name: new_args
dtype: string
- name: new_implementation
dtype: string
- name: new_return_type
dtype: string
- name: new_signature
dtype: string
- name: old_args
dtype: string
- name: old_implementation
dtype: string
- name: old_return_type
dtype: string
- name: old_signature
dtype: string
- name: enhanced_diffs_count
dtype: int64
- name: diff_count
dtype: int64
splits:
- name: base
num_bytes: 9935038
num_examples: 28
- name: semantic_removal
num_bytes: 1127827
num_examples: 8
- name: analyzed_form
num_bytes: 1223055
num_examples: 8
download_size: 2803982
dataset_size: 12285920
configs:
- config_name: default
data_files:
- split: base
path: data/base-*
- split: semantic_removal
path: data/semantic_removal-*
- split: analyzed_form
path: data/analyzed_form-*
---
# Dataset Card for "java_renaming_patch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_42_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10912131
num_examples: 7076
download_size: 5990613
dataset_size: 10912131
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_42_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_v1_doc_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 34574
num_examples: 300
- name: train_recite_qa
num_bytes: 222533
num_examples: 300
- name: eval_qa
num_bytes: 11254
num_examples: 100
- name: eval_recite_qa
num_bytes: 73368
num_examples: 100
- name: all_docs
num_bytes: 248990
num_examples: 392
- name: train
num_bytes: 283564
num_examples: 692
- name: validation
num_bytes: 11254
num_examples: 100
download_size: 0
dataset_size: 885537
---
# Dataset Card for "lmind_nq_v1_doc_qa"
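As a sanity check on this card's metadata, the top-level `dataset_size` should equal the sum of the per-split `num_bytes` values. A minimal stdlib sketch, with the split sizes hard-coded from the YAML above:

```python
# Per-split num_bytes copied verbatim from this card's YAML metadata.
splits = {
    "train_qa": 34574,
    "train_recite_qa": 222533,
    "eval_qa": 11254,
    "eval_recite_qa": 73368,
    "all_docs": 248990,
    "train": 283564,
    "validation": 11254,
}

# Sum the split sizes and compare against the declared dataset_size.
dataset_size = sum(splits.values())
assert dataset_size == 885537  # matches the card's dataset_size field
print(dataset_size)
```

(The `download_size: 0` field, by contrast, is unrelated to this sum; it reflects cached download bookkeeping rather than on-disk split sizes.)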
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/iCliniq_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 12755267
num_examples: 7321
download_size: 6748421
dataset_size: 12755267
---
# Dataset Card for "iCliniq_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |