CyberHarem/kasumi_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kasumi/かすみ/霞DOA (Azur Lane)
This is the dataset of kasumi/かすみ/霞DOA (Azur Lane), containing 500 images and their tags.
The core tags of this character are `breasts, brown_hair, long_hair, brown_eyes, large_breasts, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 593.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 366.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1062 | 705.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 530.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1062 | 936.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasumi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kasumi_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
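The IMG+TXT packages do not require waifuc: they are plain zip archives of images paired with same-stem `.txt` tag files. Below is a minimal sketch for reading the 800px package; the flat image/`.txt` layout inside the archive is an assumption, not something stated by this card.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from PIL import Image

# download and extract the 800px IMG+TXT package
zip_file = hf_hub_download(
    repo_id='CyberHarem/kasumi_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its same-stem .txt tag file (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if not os.path.exists(txt_path):
        continue
    image = Image.open(os.path.join(dataset_dir, name))
    with open(txt_path, encoding='utf-8') as f:
        tags = f.read().strip()
    print(name, image.size, tags)
```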
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, pelvic_curtain, solo, choker, sword, cleavage, white_panties, white_thighhighs, blush, sheathed, torn_clothes, japanese_clothes, open_mouth, weapon_on_back |
| 1 | 7 |  |  |  |  |  | 1girl, choker, cleavage, panties, pelvic_curtain, solo, huge_breasts, white_thighhighs, covered_nipples, areola_slip, blush |
| 2 | 6 |  |  |  |  |  | 1girl, bangs, choker, japanese_clothes, pelvic_curtain, puffy_short_sleeves, sash, solo, white_thighhighs, arm_guards, cleavage, looking_at_viewer, short_sword, thighs, weapon_on_back, hair_ribbon, lips, white_panties, collarbone, sheathed, simple_background, white_background, yellow_ribbon |
| 3 | 9 |  |  |  |  |  | 1girl, bangs, cleavage, hair_bow, japanese_clothes, pelvic_curtain, simple_background, single_braid, solo, white_background, white_thighhighs, arm_guards, blush, open_mouth, puffy_short_sleeves, reverse_grip, shiny_skin, short_sword, choker, shiny_hair, holding_sword, thighs, collarbone, yellow_bow, one_eye_closed |
| 4 | 5 |  |  |  |  |  | 1girl, bangs, fingernails, hair_bow, japanese_clothes, looking_at_viewer, pelvic_curtain, puffy_short_sleeves, shiny_hair, shiny_skin, simple_background, single_braid, solo, thighs, white_background, white_thighhighs, arm_guards, cleavage, open_mouth, white_panties, yellow_bow, ass, choker, short_sword, weapon_on_back, blush, looking_back |
| 5 | 8 |  |  |  |  |  | 1girl, arm_guards, bangs, japanese_clothes, pelvic_curtain, solo, white_panties, white_thighhighs, holding_sword, looking_at_viewer, marker_(medium), short_sword, ninja, hair_ribbon, parted_lips, short_sleeves, thighs, ass, cleavage, torn_thighhighs |
| 6 | 8 |  |  |  |  |  | 1girl, pelvic_curtain, solo, sword, single_braid, cleavage, white_thighhighs, ass, choker, white_panties, wind |
| 7 | 5 |  |  |  |  |  | 1girl, arm_guards, bangs, choker, cleavage, collarbone, hair_ribbon, pelvic_curtain, solo, white_background, japanese_clothes, sash, simple_background, white_thighhighs, bare_shoulders, looking_at_viewer, parted_lips, short_sleeves, side-tie_panties, white_panties, ass_visible_through_thighs, cherry_blossoms, petals, weapon_on_back, wind, yellow_ribbon |
| 8 | 8 |  |  |  |  |  | 1girl, nipples, blush, cum_in_pussy, solo, after_sex, female_pubic_hair, spread_legs, white_thighhighs, cumdrip, choker, mosaic_censoring, pelvic_curtain |
| 9 | 27 |  |  |  |  |  | 1girl, solo, blush, cleavage, smile, looking_at_viewer, navel, hair_ribbon, side-tie_bikini_bottom |
| 10 | 16 |  |  |  |  |  | hetero, solo_focus, mosaic_censoring, penis, 1girl, 1boy, nipples, blush, huge_breasts, nude, paizuri, cum, pussy, sex |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | pelvic_curtain | solo | choker | sword | cleavage | white_panties | white_thighhighs | blush | sheathed | torn_clothes | japanese_clothes | open_mouth | weapon_on_back | panties | huge_breasts | covered_nipples | areola_slip | bangs | puffy_short_sleeves | sash | arm_guards | looking_at_viewer | short_sword | thighs | hair_ribbon | lips | collarbone | simple_background | white_background | yellow_ribbon | hair_bow | single_braid | reverse_grip | shiny_skin | shiny_hair | holding_sword | yellow_bow | one_eye_closed | fingernails | ass | looking_back | marker_(medium) | ninja | parted_lips | short_sleeves | torn_thighhighs | wind | bare_shoulders | side-tie_panties | ass_visible_through_thighs | cherry_blossoms | petals | nipples | cum_in_pussy | after_sex | female_pubic_hair | spread_legs | cumdrip | mosaic_censoring | smile | navel | side-tie_bikini_bottom | hetero | solo_focus | penis | 1boy | nude | paizuri | cum | pussy | sex |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:-------|:---------|:--------|:-----------|:----------------|:-------------------|:--------|:-----------|:---------------|:-------------------|:-------------|:-----------------|:----------|:---------------|:------------------|:--------------|:--------|:----------------------|:-------|:-------------|:--------------------|:--------------|:---------|:--------------|:-------|:-------------|:--------------------|:-------------------|:----------------|:-----------|:---------------|:---------------|:-------------|:-------------|:----------------|:-------------|:-----------------|:--------------|:------|:---------------|:------------------|:--------|:--------------|:----------------|:------------------|:-------|:-----------------|:-------------------|:-----------------------------|:------------------|:---------|:----------|:---------------|:------------|:--------------------|:--------------|:----------|:-------------------|:--------|:--------|:-------------------------|:---------|:-------------|:--------|:-------|:-------|:----------|:------|:--------|:------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | X | X | | X | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | X | X | | X | | X | X | | | X | X | | | | | | X | X | | X | | X | X | | | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | | X | X | X | | | | | X | X | | X | X | X | X | | | | X | X | | X | X | | X | X | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | | X | X | X | | | | X | | | | | | | X | | | X | X | X | X | X | | | | | | | | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | | | | X | | X | | | | | X | | X | X | X | | | X | | X | X | X | X | | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 9 | 27 |  |  |  |  |  | X | | X | | | X | | | X | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | |
| 10 | 16 |  |  |  |  |  | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X |
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-9ea0d3-93467145852 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/pegasus-multi_news
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-multi_news
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
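For reference, the exact split this evaluation ran on can be loaded with the `datasets` library. This is a minimal sketch that loads the source data, not the predictions stored in this repository:
```python
from datasets import load_dataset

# the configuration and split used for this evaluation
dataset = load_dataset("cnn_dailymail", "3.0.0", split="test")
# col_mapping above: text -> article, target -> highlights
print(dataset[0]["article"][:200])
print(dataset[0]["highlights"][:200])
```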
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sasha](https://huggingface.co/sasha) for evaluating this model. |
michaelmallari/mlb-statcast-batters | ---
license: mit
---
|
Roudranil/shakespearean-and-modern-english-conversational-dataset | ---
language:
- en
tags:
- fine-tuning
- shakespeare
pretty_name: SandMec
size_categories:
- n<10K
task_categories:
- text-generation
configs:
- config_name: default
data_files:
- split: train
path: "data/train.csv"
- split: test
path: "data/test.csv"
dataset_info:
features:
- name: id
dtype: string
- name: translated_dialog
dtype: string
- name: og_response
dtype: string
---
# Dataset Card for `Shakespearean and Modern English Conversational Dataset`
## Table of Contents
- [Dataset Card for `Shakespearean and Modern English Conversational Dataset`](#dataset-card-for-shakespearean-and-modern-english-conversational-dataset)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
## Dataset Description
- **Homepage:** [SandMec](https://roudranil.github.io/datasets/SandMec)
- **Repository:** [Roudranil/shakespearean-chatbot](https://github.com/Roudranil/finetuning-llms-for-conversation-in-shakespearean-english)
- **Point of Contact:** [roudranil@cmi.ac.in](mailto:roudranil@cmi.ac.in)
### Dataset Summary
This dataset contains dialog pairs taken from Shakespeare's works: the first dialog is a translation into modern English, and the second is its actual response as written in Shakespeare's plays. See the [github repo](https://github.com/Roudranil/finetuning-llms-for-conversation-in-shakespearean-english) for more details.
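A minimal loading sketch with the `datasets` library (field names follow the `dataset_info` metadata above):
```python
from datasets import load_dataset

dataset = load_dataset(
    "Roudranil/shakespearean-and-modern-english-conversational-dataset",
    split="train",
)
sample = dataset[0]
print(sample["translated_dialog"])  # modern-English dialog
print(sample["og_response"])        # original Shakespearean response
```
|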
atmallen/animals_azaria_mitchell | ---
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 49381.093253968254
num_examples: 806
- name: test
num_bytes: 12375.906746031746
num_examples: 202
download_size: 23238
dataset_size: 61757.0
---
# Dataset Card for "animals_azaria_mitchell"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gayanin/babylon-native-v8-noise-op-wise | ---
dataset_info:
- config_name: del-0.1
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 645604
num_examples: 3893
- name: test
num_bytes: 69186
num_examples: 487
- name: validation
num_bytes: 73452
num_examples: 487
download_size: 444739
dataset_size: 788242
- config_name: del-0.2
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 612978
num_examples: 3893
- name: test
num_bytes: 65748
num_examples: 487
- name: validation
num_bytes: 69901
num_examples: 487
download_size: 426948
dataset_size: 748627
- config_name: del-0.3
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 579673
num_examples: 3893
- name: test
num_bytes: 62144
num_examples: 487
- name: validation
num_bytes: 66229
num_examples: 487
download_size: 406913
dataset_size: 708046
- config_name: del-0.4
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 546954
num_examples: 3893
- name: test
num_bytes: 59003
num_examples: 487
- name: validation
num_bytes: 62234
num_examples: 487
download_size: 387712
dataset_size: 668191
- config_name: del-0.5
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 514564
num_examples: 3893
- name: test
num_bytes: 55312
num_examples: 487
- name: validation
num_bytes: 58575
num_examples: 487
download_size: 368193
dataset_size: 628451
- config_name: ins-0.1
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 713756
num_examples: 3893
- name: test
num_bytes: 76595
num_examples: 487
- name: validation
num_bytes: 80904
num_examples: 487
download_size: 492280
dataset_size: 871255
- config_name: ins-0.2
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 749036
num_examples: 3893
- name: test
num_bytes: 80123
num_examples: 487
- name: validation
num_bytes: 85216
num_examples: 487
download_size: 520648
dataset_size: 914375
- config_name: ins-0.3
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 785112
num_examples: 3893
- name: test
num_bytes: 83719
num_examples: 487
- name: validation
num_bytes: 89042
num_examples: 487
download_size: 547838
dataset_size: 957873
- config_name: ins-0.4
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 819254
num_examples: 3893
- name: test
num_bytes: 87784
num_examples: 487
- name: validation
num_bytes: 93365
num_examples: 487
download_size: 573227
dataset_size: 1000403
- config_name: ins-0.5
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 855166
num_examples: 3893
- name: test
num_bytes: 91096
num_examples: 487
- name: validation
num_bytes: 97372
num_examples: 487
download_size: 599001
dataset_size: 1043634
- config_name: sub-0.1
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 679033
num_examples: 3893
- name: test
num_bytes: 72849
num_examples: 487
- name: validation
num_bytes: 77306
num_examples: 487
download_size: 473514
dataset_size: 829188
- config_name: sub-0.2
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 679965
num_examples: 3893
- name: test
num_bytes: 72976
num_examples: 487
- name: validation
num_bytes: 77427
num_examples: 487
download_size: 482941
dataset_size: 830368
- config_name: sub-0.3
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 680760
num_examples: 3893
- name: test
num_bytes: 73051
num_examples: 487
- name: validation
num_bytes: 77526
num_examples: 487
download_size: 486337
dataset_size: 831337
- config_name: sub-0.4
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 681700
num_examples: 3893
- name: test
num_bytes: 73165
num_examples: 487
- name: validation
num_bytes: 77577
num_examples: 487
download_size: 488283
dataset_size: 832442
- config_name: sub-0.5
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 682570
num_examples: 3893
- name: test
num_bytes: 73285
num_examples: 487
- name: validation
num_bytes: 77753
num_examples: 487
download_size: 489062
dataset_size: 833608
configs:
- config_name: del-0.1
data_files:
- split: train
path: del-0.1/train-*
- split: test
path: del-0.1/test-*
- split: validation
path: del-0.1/validation-*
- config_name: del-0.2
data_files:
- split: train
path: del-0.2/train-*
- split: test
path: del-0.2/test-*
- split: validation
path: del-0.2/validation-*
- config_name: del-0.3
data_files:
- split: train
path: del-0.3/train-*
- split: test
path: del-0.3/test-*
- split: validation
path: del-0.3/validation-*
- config_name: del-0.4
data_files:
- split: train
path: del-0.4/train-*
- split: test
path: del-0.4/test-*
- split: validation
path: del-0.4/validation-*
- config_name: del-0.5
data_files:
- split: train
path: del-0.5/train-*
- split: test
path: del-0.5/test-*
- split: validation
path: del-0.5/validation-*
- config_name: ins-0.1
data_files:
- split: train
path: ins-0.1/train-*
- split: test
path: ins-0.1/test-*
- split: validation
path: ins-0.1/validation-*
- config_name: ins-0.2
data_files:
- split: train
path: ins-0.2/train-*
- split: test
path: ins-0.2/test-*
- split: validation
path: ins-0.2/validation-*
- config_name: ins-0.3
data_files:
- split: train
path: ins-0.3/train-*
- split: test
path: ins-0.3/test-*
- split: validation
path: ins-0.3/validation-*
- config_name: ins-0.4
data_files:
- split: train
path: ins-0.4/train-*
- split: test
path: ins-0.4/test-*
- split: validation
path: ins-0.4/validation-*
- config_name: ins-0.5
data_files:
- split: train
path: ins-0.5/train-*
- split: test
path: ins-0.5/test-*
- split: validation
path: ins-0.5/validation-*
- config_name: sub-0.1
data_files:
- split: train
path: sub-0.1/train-*
- split: test
path: sub-0.1/test-*
- split: validation
path: sub-0.1/validation-*
- config_name: sub-0.2
data_files:
- split: train
path: sub-0.2/train-*
- split: test
path: sub-0.2/test-*
- split: validation
path: sub-0.2/validation-*
- config_name: sub-0.3
data_files:
- split: train
path: sub-0.3/train-*
- split: test
path: sub-0.3/test-*
- split: validation
path: sub-0.3/validation-*
- config_name: sub-0.4
data_files:
- split: train
path: sub-0.4/train-*
- split: test
path: sub-0.4/test-*
- split: validation
path: sub-0.4/validation-*
- config_name: sub-0.5
data_files:
- split: train
path: sub-0.5/train-*
- split: test
path: sub-0.5/test-*
- split: validation
path: sub-0.5/validation-*
---
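Each configuration name appears to encode a noise operation (`del`, `ins`, `sub`) and a rate (0.1–0.5), pairing reference strings (`refs`) with noised counterparts (`trans`); this reading of the names is an inference from the metadata, not stated by the card. A minimal loading sketch:
```python
from datasets import load_dataset

# load one noise configuration, e.g. deletions at rate 0.1
dataset = load_dataset("gayanin/babylon-native-v8-noise-op-wise", "del-0.1")
print(dataset)  # train / test / validation splits
print(dataset["train"][0]["refs"])
print(dataset["train"][0]["trans"])
```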
|
Multimodal-Fatima/OxfordFlowers_test_facebook_opt_2.7b_Visclues_ns_6149 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 267858097.375
num_examples: 6149
- name: fewshot_1_bs_16
num_bytes: 270237106.375
num_examples: 6149
- name: fewshot_3_bs_16
num_bytes: 274972317.375
num_examples: 6149
download_size: 797641513
dataset_size: 813067521.125
---
# Dataset Card for "OxfordFlowers_test_facebook_opt_2.7b_Visclues_ns_6149"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
StudentLLM/Open-Wyvern-74k | ---
task_categories:
- text-classification
- question-answering
- summarization
- conversational
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
<p align="center"><img src="https://cdn-uploads.huggingface.co/production/uploads/63e087b6a98d931aa90c1b9c/jm4fCY9DMGDxDRyhIeDZh.jpeg"></p>
# The Wyvern 🐉 Dataset
Let's introduce the **Wyvern 🐉** dataset, a new combination of datasets ([Open-Orca](https://huggingface.co/datasets/Open-Orca/OpenOrca),
[Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus), [airoboros](https://huggingface.co/datasets/jondurbin/airoboros-2.1),
[Dolly](https://huggingface.co/datasets/databricks/databricks-dolly-15k))!
We have integrated high-quality datasets, following the principle that quality matters more than quantity.
In addition, we have deduplicated overlapping samples to improve the dataset's quality, since each source dataset contains some contamination.
Please see below for more details about the dataset!
# Dataset Details
The **Wyvern 🐉** dataset is a mixture of several datasets (Open-Orca, Open-Platypus, airoboros, Dolly), as mentioned above.
The specific configuration of the dataset is as follows.
(Open-Orca GPT-4 answered dataset was sampled using stratified sampling)
- **Open-Platypus(100%) + airoboros(100%) + Open-Orca(GPT-4)(5%)(stratified sampled) + Dolly-15k(100%)**
|Dataset Name|Sampled Size (ratio)|Deduped Size|License Type|
|---|---|---|---|
|[Open-Platypus](https://huggingface.co/datasets/garage-bAInd/Open-Platypus)|24.9k(100%)|16.8k|None|
|[airoboros](https://huggingface.co/datasets/jondurbin/airoboros-2.1)|36.3k(100%)|11k|apache-2.0|
|[Open-Orca](https://huggingface.co/datasets/Open-Orca/OpenOrca)|999.9k → 49.7k(5%)|35.6k|MIT|
|[Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k)|15k(100%)|11k|cc-by-sa-3.0|
After the deduplication process, the size of the combined dataset drops from 125k to 74k! (125k → 74k)
# Data Deduplication
We used Open-Platypus's [data similarity check code](https://github.com/arielnlee/Platypus/blob/main/data_pipeline/data_similarity.ipynb) as a reference for removing duplicated data.
The specific code for deduplication will be uploaded soon!
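Until then, here is a rough sketch of the general idea: embedding-based near-duplicate filtering with `sentence-transformers`. This is an illustrative approximation under our own assumptions (model choice, similarity threshold), not the exact pipeline used for this dataset:
```python
from sentence_transformers import SentenceTransformer

def deduplicate(texts, threshold=0.9):
    """Keep the first occurrence within each group of near-duplicate texts."""
    model = SentenceTransformer("all-MiniLM-L6-v2")
    # normalized embeddings make dot products equal cosine similarities
    emb = model.encode(texts, convert_to_tensor=True, normalize_embeddings=True)
    kept = []
    for i in range(len(texts)):
        if kept:
            sims = emb[i] @ emb[kept].T  # similarity against everything kept so far
            if sims.max().item() >= threshold:
                continue  # too similar to an already-kept sample
        kept.append(i)
    return [texts[i] for i in kept]

print(deduplicate(["What is 2+2?", "what is 2 + 2 ?", "Name a mammal."]))
```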
# Citations
```
@article{platypus2023,
title={Platypus: Quick, Cheap, and Powerful Refinement of LLMs},
author={Ariel N. Lee and Cole J. Hunter and Nataniel Ruiz},
booktitle={arXiv preprint arxiv:2308.07317},
year={2023}
}
```
```
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```
```
@online{DatabricksBlog2023DollyV2,
author = {Mike Conover and Matt Hayes and Ankit Mathur and Jianwei Xie and Jun Wan and Sam Shah and Ali Ghodsi and Patrick Wendell and Matei Zaharia and Reynold Xin},
title = {Free Dolly: Introducing the World's First Truly Open Instruction-Tuned LLM},
year = {2023},
url = {https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm},
urldate = {2023-06-30}
}
``` |
lucadiliello/squad_as2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 98242758
num_examples: 441978
- name: dev
num_bytes: 6088351
num_examples: 26677
- name: test
num_bytes: 6161786
num_examples: 26925
download_size: 16183526
dataset_size: 110492895
---
# Dataset Card for "squad_as2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ednalaxer/datacomp_small_clip1_30pct_asciichr_greater_than_4 | ---
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: detected_language
dtype: string
splits:
- name: train
num_bytes: 1169130941.7924173
num_examples: 3642339
download_size: 985672181
dataset_size: 1169130941.7924173
---
# Dataset Card for "datacomp_small_clip1_30pct_asciichr_greater_than_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sten_mkii_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sten_mkii/ステンMK-II/司登MkⅡ (Girls' Frontline)
This is the dataset of sten_mkii/ステンMK-II/司登MkⅡ (Girls' Frontline), containing 34 images and their tags.
The core tags of this character are `twintails, blonde_hair, long_hair, hat, beret, red_headwear, breasts, ribbon, yellow_eyes, bangs, hair_ribbon, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 38.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sten_mkii_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 24.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sten_mkii_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 82 | 50.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sten_mkii_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 35.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sten_mkii_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 82 | 69.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sten_mkii_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sten_mkii_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
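Building on the snippet above, the tag clusters listed below can serve as crude outfit filters. A minimal sketch that keeps only items carrying a few cluster-1 tags (this assumes iterating `item.meta['tags']` yields tag names, which may vary by waifuc version):
```python
from waifuc.source import LocalSource

# tags drawn from cluster #1 below (school-uniform outfit)
wanted = {'school_uniform', 'plaid_skirt', 'red_jacket'}
source = LocalSource('dataset_dir')
for item in source:
    tags = set(item.meta['tags'])
    if wanted <= tags:  # all wanted tags present
        print(item.meta['filename'])
```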
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, solo, white_background, simple_background, skirt, blush, looking_at_viewer, red_jacket, white_shirt, open_mouth |
| 1 | 6 |  |  |  |  |  | brown_skirt, collared_shirt, white_shirt, black_ribbon, long_sleeves, open_jacket, plaid_skirt, pleated_skirt, red_jacket, 1girl, brown_eyes, brown_footwear, hair_between_eyes, kneehighs, looking_at_viewer, shoes, solo, thighhighs, white_background, asymmetrical_legwear, blush, closed_mouth, dress_shirt, full_body, gun, school_uniform, simple_background, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_background | simple_background | skirt | blush | looking_at_viewer | red_jacket | white_shirt | open_mouth | brown_skirt | collared_shirt | black_ribbon | long_sleeves | open_jacket | plaid_skirt | pleated_skirt | brown_eyes | brown_footwear | hair_between_eyes | kneehighs | shoes | thighhighs | asymmetrical_legwear | closed_mouth | dress_shirt | full_body | gun | school_uniform | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------------|:--------------------|:--------|:--------|:--------------------|:-------------|:--------------|:-------------|:--------------|:-----------------|:---------------|:---------------|:--------------|:--------------|:----------------|:-------------|:-----------------|:--------------------|:------------|:--------|:-------------|:-----------------------|:---------------|:--------------|:------------|:------|:-----------------|:-----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
alayaran/bodo_english_parallel_valid | ---
license: mit
---
|
naem1023/augmented-kowiki | ---
license: apache-2.0
---
|
faziletgokbudak/instructpix2pix-clip-filtered | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edited_prompt
dtype: string
- name: SH_light
dtype: image
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 1625559685.0
num_examples: 500
download_size: 802197008
dataset_size: 1625559685.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kafka_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kafka/カフカ/卡夫卡 (Arknights)
This is the dataset of kafka/カフカ/卡夫卡 (Arknights), containing 38 images and their tags.
The core tags of this character are `brown_hair, long_hair, yellow_eyes, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 38 | 58.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 38 | 50.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 97 | 99.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kafka_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kafka_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
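After loading, waifuc items can be re-exported to disk; a minimal sketch using `SaveExporter` as shown in the waifuc tutorials (treat the exporter name and behavior as version-dependent):
```python
from waifuc.export import SaveExporter
from waifuc.source import LocalSource

# re-save the loaded items (images plus metadata) to a new directory
source = LocalSource('dataset_dir')
source.export(SaveExporter('exported_dataset'))
```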
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, green_jacket, looking_at_viewer, red_ribbon, solo, long_sleeves, smile, hair_ribbon, red_dress, bow, white_socks, holding, ponytail, breasts, full_body, black_footwear, christmas, gift_box, red_skirt, simple_background |
| 1 | 20 |  |  |  |  |  | 1girl, solo, looking_at_viewer, holding, hood_up, simple_background, white_background, fingerless_gloves, black_jacket, black_skirt, white_shirt, bandaid, coat, grin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_jacket | looking_at_viewer | red_ribbon | solo | long_sleeves | smile | hair_ribbon | red_dress | bow | white_socks | holding | ponytail | breasts | full_body | black_footwear | christmas | gift_box | red_skirt | simple_background | hood_up | white_background | fingerless_gloves | black_jacket | black_skirt | white_shirt | bandaid | coat | grin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------------|:-------------|:-------|:---------------|:--------|:--------------|:------------|:------|:--------------|:----------|:-----------|:----------|:------------|:-----------------|:------------|:-----------|:------------|:--------------------|:----------|:-------------------|:--------------------|:---------------|:--------------|:--------------|:----------|:-------|:-------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 20 |  |  |  |  |  | X | | X | | X | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
wesslen/ecfr-title-12 | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: chapter
sequence: string
- name: chapter_title
sequence: string
- name: subchapter
sequence: string
- name: subchapter_title
sequence: string
- name: part
sequence: string
- name: part_title
sequence: string
- name: section
sequence: string
- name: section_title
sequence: string
splits:
- name: train
num_bytes: 16669304
num_examples: 4665
download_size: 5913311
dataset_size: 16669304
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
caiobd/alpaca-data-pt-br-autotrain | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 19628487
num_examples: 51759
download_size: 11101501
dataset_size: 19628487
---
# Dataset Card for "alpaca-data-pt-br-autotrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
notrichardren/azaria-mitchell-diff-filtered-2 | ---
configs:
- config_name: default
data_files:
- split: cities
path: data/cities-*
- split: companies
path: data/companies-*
- split: animals
path: data/animals-*
- split: elements
path: data/elements-*
- split: inventions
path: data/inventions-*
- split: facts
path: data/facts-*
dataset_info:
features:
- name: claim
dtype: string
- name: label
dtype: int64
- name: dataset
dtype: string
- name: qa_type
dtype: int64
- name: ind
dtype: int64
splits:
- name: cities
num_bytes: 311504
num_examples: 4416
- name: companies
num_bytes: 86125
num_examples: 777
- name: animals
num_bytes: 60222
num_examples: 692
- name: elements
num_bytes: 52499
num_examples: 636
- name: inventions
num_bytes: 49480
num_examples: 594
- name: facts
num_bytes: 43529
num_examples: 472
download_size: 209164
dataset_size: 603359
---
# Dataset Card for "azaria-mitchell-diff-filtered-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SDbiaseval/dataset-v-1.4_CLIP_identities_random_seeds | ---
dataset_info:
features:
- name: adjective
dtype: string
- name: profession
dtype: string
- name: 'no'
dtype: int32
- name: image_path
dtype: string
- name: image
dtype: image
- name: gender
dtype: string
- name: identity
dtype: string
splits:
- name: train
num_bytes: 1172792739.5
num_examples: 31500
download_size: 1167658244
dataset_size: 1172792739.5
---
# Dataset Card for "dataset-v-1.4_CLIP_identities_random_seeds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca | ---
pretty_name: Evaluation run of blueapple8259/TinyStories-Alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [blueapple8259/TinyStories-Alpaca](https://huggingface.co/blueapple8259/TinyStories-Alpaca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T12:08:32.889015](https://huggingface.co/datasets/open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public/blob/main/results_2023-11-13T12-08-32.889015.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2343052459270292,\n\
\ \"acc_stderr\": 0.030014283954142254,\n \"acc_norm\": 0.2339194036543238,\n\
\ \"acc_norm_stderr\": 0.030804772038430715,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.46675301460809676,\n\
\ \"mc2_stderr\": 0.016264340534335325,\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931191567,\n \"f1\": 0.008077810402684559,\n\
\ \"f1_stderr\": 0.000561047245736677\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702259,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453961\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25781716789484166,\n\
\ \"acc_stderr\": 0.004365388351563101,\n \"acc_norm\": 0.24915355506871142,\n\
\ \"acc_norm_stderr\": 0.004316389476434519\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.19032258064516128,\n\
\ \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.19032258064516128,\n\
\ \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.15656565656565657,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.15656565656565657,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.15544041450777202,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.15544041450777202,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2076923076923077,\n \"acc_stderr\": 0.020567539567246794,\n\
\ \"acc_norm\": 0.2076923076923077,\n \"acc_norm_stderr\": 0.020567539567246794\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267634,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267634\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341923,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341923\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294284,\n \"\
acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294284\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.017208579357787572,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.017208579357787572\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.125,\n \"acc_stderr\": 0.022554842722407934,\n \"acc_norm\": 0.125,\n\
\ \"acc_norm_stderr\": 0.022554842722407934\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \"\
acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19730941704035873,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.19730941704035873,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462202,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462202\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212095,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212095\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395969,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395969\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20261437908496732,\n \"acc_stderr\": 0.02301544687798565,\n\
\ \"acc_norm\": 0.20261437908496732,\n \"acc_norm_stderr\": 0.02301544687798565\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.022827317491059675,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.022827317491059675\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.21631205673758866,\n \"acc_stderr\": 0.0245617205605628,\n \
\ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.0245617205605628\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927234,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927234\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.031755547866299194,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.031755547866299194\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041834,\n \"mc2\": 0.46675301460809676,\n\
\ \"mc2_stderr\": 0.016264340534335325\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5185477505919495,\n \"acc_stderr\": 0.014042813708888378\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \
\ \"em_stderr\": 0.00036305608931191567,\n \"f1\": 0.008077810402684559,\n\
\ \"f1_stderr\": 0.000561047245736677\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/blueapple8259/TinyStories-Alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|arc:challenge|25_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|drop|3_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|gsm8k|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hellaswag|10_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T12-08-32.889015.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- '**/details_harness|winogrande|5_2023-11-13T12-08-32.889015.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T12-08-32.889015.parquet'
- config_name: results
data_files:
- split: 2023_11_13T12_08_32.889015
path:
- results_2023-11-13T12-08-32.889015.parquet
- split: latest
path:
- results_2023-11-13T12-08-32.889015.parquet
---
# Dataset Card for Evaluation run of blueapple8259/TinyStories-Alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/blueapple8259/TinyStories-Alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [blueapple8259/TinyStories-Alpaca](https://huggingface.co/blueapple8259/TinyStories-Alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public",
"harness_winogrande_5",
split="train")
```
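The YAML header above also defines a `results` configuration whose `latest` split always resolves to the most recent run; as a small companion sketch, the aggregated metrics shown under "Latest results" below can be loaded the same way:
```python
from datasets import load_dataset

# Load the aggregated metrics from the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public",
    "results",
    split="latest",
)
```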
## Latest results
These are the [latest results from run 2023-11-13T12:08:32.889015](https://huggingface.co/datasets/open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca_public/blob/main/results_2023-11-13T12-08-32.889015.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2343052459270292,
"acc_stderr": 0.030014283954142254,
"acc_norm": 0.2339194036543238,
"acc_norm_stderr": 0.030804772038430715,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.46675301460809676,
"mc2_stderr": 0.016264340534335325,
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931191567,
"f1": 0.008077810402684559,
"f1_stderr": 0.000561047245736677
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702259,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453961
},
"harness|hellaswag|10": {
"acc": 0.25781716789484166,
"acc_stderr": 0.004365388351563101,
"acc_norm": 0.24915355506871142,
"acc_norm_stderr": 0.004316389476434519
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.0256042334708991,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.0256042334708991
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.19032258064516128,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.19032258064516128,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.15656565656565657,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.15656565656565657,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.15544041450777202,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.15544041450777202,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2076923076923077,
"acc_stderr": 0.020567539567246794,
"acc_norm": 0.2076923076923077,
"acc_norm_stderr": 0.020567539567246794
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267634,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267634
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341923,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341923
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.18543046357615894,
"acc_stderr": 0.03173284384294284,
"acc_norm": 0.18543046357615894,
"acc_norm_stderr": 0.03173284384294284
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.017208579357787572,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.017208579357787572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.125,
"acc_stderr": 0.022554842722407934,
"acc_norm": 0.125,
"acc_norm_stderr": 0.022554842722407934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19730941704035873,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.19730941704035873,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462202,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462202
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212095,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212095
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395969,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395969
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20261437908496732,
"acc_stderr": 0.02301544687798565,
"acc_norm": 0.20261437908496732,
"acc_norm_stderr": 0.02301544687798565
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.022827317491059675,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.022827317491059675
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.0245617205605628,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.0245617205605628
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927234,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927234
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.031755547866299194,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.031755547866299194
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041834,
"mc2": 0.46675301460809676,
"mc2_stderr": 0.016264340534335325
},
"harness|winogrande|5": {
"acc": 0.5185477505919495,
"acc_stderr": 0.014042813708888378
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931191567,
"f1": 0.008077810402684559,
"f1_stderr": 0.000561047245736677
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
fightfei/test-course-desc | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4787
num_examples: 36
- name: test
num_bytes: 505
num_examples: 4
download_size: 4698
dataset_size: 5292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ura-hcmut/synthetic_reasoning | ---
license: cc-by-nc-sa-4.0
task_categories:
- text2text-generation
language:
- vi
configs:
- config_name: induction_gcp
data_files:
- split: train
path: synthetic_reasoning_gcp_induction_training.csv
- split: test
path: synthetic_reasoning_gcp_induction.csv
- config_name: induction_azr
data_files:
- split: train
path: synthetic_reasoning_azr_induction_training.csv
- split: test
path: synthetic_reasoning_azr_induction.csv
- config_name: pattern_match_gcp
data_files:
- split: train
path: synthetic_reasoning_gcp_pattern_match_training.csv
- split: test
path: synthetic_reasoning_gcp_pattern_match.csv
- config_name: pattern_match_azr
data_files:
- split: train
path: synthetic_reasoning_azr_pattern_match_training.csv
- split: test
path: synthetic_reasoning_azr_pattern_match.csv
- config_name: variable_substitution_gcp
data_files:
- split: train
path: synthetic_reasoning_gcp_variable_substitution_training.csv
- split: test
path: synthetic_reasoning_gcp_variable_substitution.csv
- config_name: variable_substitution_azr
data_files:
- split: train
path: synthetic_reasoning_azr_variable_substitution_training.csv
- split: test
path: synthetic_reasoning_azr_variable_substitution.csv
---
# Synthetic reasoning dataset
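A minimal loading sketch, using one of the configuration and split names declared in the YAML header above:
```python
from datasets import load_dataset

# Load the training split of one of the six task configurations.
data = load_dataset("ura-hcmut/synthetic_reasoning", "induction_gcp", split="train")
```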
Original version:
- https://huggingface.co/datasets/lighteval/synthetic_reasoning
Translation source code: https://github.com/martinakaduc/ura-llama/tree/main/dataset_scripts/custom_datasets |
bene-ges/spellmapper_en_train_micro | ---
license: cc-by-4.0
language:
- en
---
This is a micro dataset used by the example [training script](https://github.com/NVIDIA/NeMo/blob/stable/examples/nlp/spellchecking_asr_customization/run_training.sh) for the [SpellMapper](https://arxiv.org/abs/2306.02317) model.
A pretrained checkpoint is [available](https://huggingface.co/bene-ges/spellmapper_asr_customization_en). |
microsoft/timewarp | ---
license: mit
---
# Timewarp datasets
This dataset contains molecular dynamics simulation data that was used to train the neural networks in the NeurIPS 2023 paper [Timewarp: Transferable Acceleration of Molecular Dynamics by Learning Time-Coarsened Dynamics](https://arxiv.org/abs/2302.01170).
This dataset consists of many molecular dynamics trajectories of small peptides (2-4 amino acids) simulated with an implicit water force field.
For each protein, two files are available (a minimal inspection sketch follows the list):
* `protein-state0.pdb`: contains the topology and initial 3D XYZ coordinates.
* `protein-arrays.npz`: contains trajectory information.
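The key names inside the `.npz` archive are not documented here, so this sketch enumerates whatever arrays the file provides rather than assuming a schema; the path is a placeholder:
```python
import numpy as np

# Open the trajectory archive for one peptide; np.load returns a lazy
# NpzFile whose member arrays are accessed by name.
arrays = np.load("protein-arrays.npz")

# List every stored array with its shape and dtype.
for name in arrays.files:
    print(name, arrays[name].shape, arrays[name].dtype)

# The matching protein-state0.pdb (topology + initial coordinates) can be
# read with any standard MD toolkit, e.g. mdtraj or MDAnalysis.
```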
The datasets are split into the following directories:
# 2AA-1-big "Two Amino Acid" data set
This folder contains a data set of all-atom molecular dynamics trajectories for 380
of the 400 dipeptides, i.e. small proteins composed of two amino acids.
This dataset was originally created missing 20 of the 400 possible dipeptides.
The `2AA-1-complete` dataset completes this by including all 400.
Each peptide is simulated using classical molecular dynamics and the
water is simulated using an implicit water model.
The trajectories are saved only every 10000 MD steps; unlike the other datasets
in the Timewarp project, no intermediately spaced frames are stored.
# 2AA-1-complete "Two Amino Acid" data set
This folder contains a data set of all-atom molecular dynamics trajectories for all 400
dipeptides, i.e. small proteins composed of two amino acids.
This also includes the peptides missing from the other 2AA datasets.
Each peptide is simulated using classical molecular dynamics and the
water is simulated using an implicit water model.
# 4AA-huge "Four Amino Acid" data set, tetrapeptides
This folder contains a data set of all-atom molecular dynamics trajectories for
tetrapeptides, i.e. small proteins composed of four amino acids.
The data set contains mostly validation and test trajectories, as it was used
primarily for validation and testing purposes.
The training trajectories used are usually shorter.
Each peptide is simulated for 1 microsecond using classical molecular dynamics, and the
water is simulated using an implicit water model.
# 4AA-large "Four Amino Acid" data set, tetrapeptides
This folder contains a data set of all-atom molecular dynamics trajectories for
2333 tetrapeptides, i.e. small proteins composed of four amino acids.
The data set is split into 1500 tetrapeptides in the train set, 400 in validation, and 433 in test.
Each peptide in the train set is simulated for 50 ns using classical molecular dynamics, and the
water is simulated using an implicit water model. Each validation and test peptide is simulated for 500 ns.
|
kgr123/quality_counter_4000_4_simple | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 22008676
num_examples: 1929
- name: train
num_bytes: 21821375
num_examples: 1935
- name: validation
num_bytes: 22277198
num_examples: 1941
download_size: 14631301
dataset_size: 66107249
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
dianamihalache27/english_taskA | ---
license: mit
---
|
open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b | ---
pretty_name: Evaluation run of arcee-ai/Saul-Instruct-Clown-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [arcee-ai/Saul-Instruct-Clown-7b](https://huggingface.co/arcee-ai/Saul-Instruct-Clown-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T17:19:26.974956](https://huggingface.co/datasets/open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b/blob/main/results_2024-03-21T17-19-26.974956.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479335537987747,\n\
\ \"acc_stderr\": 0.032126003622606675,\n \"acc_norm\": 0.6483816083543984,\n\
\ \"acc_norm_stderr\": 0.03278555571808547,\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6320459324850217,\n\
\ \"mc2_stderr\": 0.014970778525538173\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6484641638225256,\n \"acc_stderr\": 0.013952413699600938,\n\
\ \"acc_norm\": 0.6808873720136519,\n \"acc_norm_stderr\": 0.013621696119173307\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6694881497709619,\n\
\ \"acc_stderr\": 0.004694360968929403,\n \"acc_norm\": 0.8622784305915157,\n\
\ \"acc_norm_stderr\": 0.0034390323355350393\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834829,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834829\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468348,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468348\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37206703910614525,\n\
\ \"acc_stderr\": 0.016165847583563295,\n \"acc_norm\": 0.37206703910614525,\n\
\ \"acc_norm_stderr\": 0.016165847583563295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897226,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.6320459324850217,\n\
\ \"mc2_stderr\": 0.014970778525538173\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305892\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \
\ \"acc_stderr\": 0.012864471384836705\n }\n}\n```"
repo_url: https://huggingface.co/arcee-ai/Saul-Instruct-Clown-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|arc:challenge|25_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|gsm8k|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hellaswag|10_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-19-26.974956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T17-19-26.974956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- '**/details_harness|winogrande|5_2024-03-21T17-19-26.974956.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T17-19-26.974956.parquet'
- config_name: results
data_files:
- split: 2024_03_21T17_19_26.974956
path:
- results_2024-03-21T17-19-26.974956.parquet
- split: latest
path:
- results_2024-03-21T17-19-26.974956.parquet
---
# Dataset Card for Evaluation run of arcee-ai/Saul-Instruct-Clown-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [arcee-ai/Saul-Instruct-Clown-7b](https://huggingface.co/arcee-ai/Saul-Instruct-Clown-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b",
"harness_winogrande_5",
split="train")
```
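The aggregated metrics can be loaded the same way through the "results" configuration defined above; a minimal sketch (the exact column layout of the flattened results parquet is not guaranteed, so the print is just for inspection):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# always points to the most recent evaluation (see the configs above).
results = load_dataset(
    "open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b",
    "results",
    split="latest",
)
print(results[0])
```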
## Latest results
These are the [latest results from run 2024-03-21T17:19:26.974956](https://huggingface.co/datasets/open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b/blob/main/results_2024-03-21T17-19-26.974956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6479335537987747,
"acc_stderr": 0.032126003622606675,
"acc_norm": 0.6483816083543984,
"acc_norm_stderr": 0.03278555571808547,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6320459324850217,
"mc2_stderr": 0.014970778525538173
},
"harness|arc:challenge|25": {
"acc": 0.6484641638225256,
"acc_stderr": 0.013952413699600938,
"acc_norm": 0.6808873720136519,
"acc_norm_stderr": 0.013621696119173307
},
"harness|hellaswag|10": {
"acc": 0.6694881497709619,
"acc_stderr": 0.004694360968929403,
"acc_norm": 0.8622784305915157,
"acc_norm_stderr": 0.0034390323355350393
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834829,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834829
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468348,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468348
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37206703910614525,
"acc_stderr": 0.016165847583563295,
"acc_norm": 0.37206703910614525,
"acc_norm_stderr": 0.016165847583563295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897226,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.6320459324850217,
"mc2_stderr": 0.014970778525538173
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305892
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836705
}
}
```
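To work with these numbers programmatically, the raw JSON file linked above can also be fetched directly. A sketch using `huggingface_hub` (note: depending on the file layout, the metrics shown above may sit under a top-level `results` key, so the code falls back gracefully):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the link above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b",
    repo_type="dataset",
    filename="results_2024-03-21T17-19-26.974956.json",
)
with open(path) as f:
    data = json.load(f)

# Some results files nest the metrics under a "results" key
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```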
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AIGym/news | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 13828692
num_examples: 11314
download_size: 8908140
dataset_size: 13828692
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
skashyap96/autotrain-data-led-samsum-dialogsum | ---
task_categories:
- conditional-text-generation
---
# AutoTrain Dataset for project: led-samsum-dialogsum
## Dataset Description
This dataset has been automatically processed by AutoTrain for project led-samsum-dialogsum.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"feat_Unnamed: 0": 0,
"feat_id": 0,
"text": "Amanda: I baked cookies. Do you want some?\nJerry: Sure!\nAmanda: I'll bring you tomorrow :-)",
"target": "Amanda baked cookies and will bring Jerry some tomorrow."
},
{
"feat_Unnamed: 0": 1,
"feat_id": 1,
"text": "Olivia: Who are you voting for in this election? \nOliver: Liberals as always.\nOlivia: Me too!!\nOliver: Great",
"target": "Olivia and Olivier are voting for liberals in this election. "
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"feat_Unnamed: 0": "Value(dtype='int64', id=None)",
"feat_id": "Value(dtype='int64', id=None)",
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 27191 |
| valid | 1318 |
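A minimal loading sketch (assuming the AutoTrain data repo loads directly with `datasets`; the field names follow the schema above):
```python
from datasets import load_dataset

dataset = load_dataset("skashyap96/autotrain-data-led-samsum-dialogsum")

sample = dataset["train"][0]
print(sample["text"])    # the source dialogue
print(sample["target"])  # the reference summary
```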
|
DivGo/sentiment_analysis | ---
license: apache-2.0
---
|
helenqu/astro-time-series | ---
task_categories:
- feature-extraction
tags:
- astro
size_categories:
- 1M<n<10M
---
# Astronomical Time-Series Dataset
This is the full dataset of astronomical time-series from the 2018 Photometric LSST Astronomical Time-Series Classification Challenge (PLAsTiCC) Kaggle competition. There are 18 types of astronomical sources represented, including transient phenomena (e.g. supernovae, kilonovae) and variable objects (e.g. active galactic nuclei, Mira variables).
The original Kaggle competition can be found [here](https://www.kaggle.com/c/PLAsTiCC-2018). [This note](https://arxiv.org/abs/1810.00001) from the competition describes the dataset in detail. Astronomers may be interested in [this paper](https://arxiv.org/abs/1903.11756) describing the simulations used to generate the data.
## Dataset Structure
### Data Fields
- **object_id**: unique object identifier
- **times_wv**: 2D array of shape (N, 2) containing the observation times (modified Julian days, MJD) and filter (wavelength) for each observation, N = number of observations
- **target**: 2D array of shape (N, 2) containing the flux (arbitrary units) and flux error for each observation
- **label**: integer representing the class of the object (see below)
- **redshift**: true redshift of the object
- **ddf**: 1 if the object was in the deep drilling fields (DDF) survey area of LSST, 0 if wide-fast-deep (WFD)
- **hostgal_specz**: spectroscopic redshift of the host galaxy
- **hostgal_photoz**: photometric redshift of the host galaxy
- **hostgal_photoz_err**: uncertainty on the photometric redshift
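As a quick orientation, the sketch below streams a single example and unpacks the observation arrays (assuming the dataset loads directly with `datasets`; field names and shapes follow the list above):
```python
import numpy as np
from datasets import load_dataset

# Stream to avoid downloading the full (1M+ example) dataset up front
ds = load_dataset("helenqu/astro-time-series", split="train", streaming=True)
example = next(iter(ds))

times_wv = np.asarray(example["times_wv"])  # (N, 2): time in MJD, filter/wavelength
target = np.asarray(example["target"])      # (N, 2): flux, flux error
print(example["label"], example["redshift"], times_wv.shape, target.shape)
```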
### Data Splits
The original PLAsTiCC challenge had a training set that was biased to be lower redshift, brighter, and higher signal-to-noise than the test set. This was created to emulate a spectroscopically confirmed subset of observations that typically would be used to train a machine learning classifier. The test set represents a realistic simulation of all LSST observations -- fainter and noisier than the training set. In this dataset, the original PLAsTiCC training set was split into 90/10 training/validation and the original test set was uploaded unchanged.
- **train**: 90% of the PLAsTiCC training set
- **validation**: 10% of the PLAsTiCC training set
- **test**: full PLAsTiCC test set
## Additional Information
### Class Descriptions
```
6: microlens-single
15: tidal disruption event (TDE)
16: eclipsing binary (EB)
42: type II supernova (SNII)
52: peculiar type Ia supernova (SNIax)
53: Mira variable
62: type Ibc supernova (SNIbc)
64: kilonova (KN)
65: M-dwarf
67: peculiar type Ia supernova (SNIa-91bg)
88: active galactic nuclei (AGN)
90: type Ia supernova (SNIa)
92: RR-Lyrae (RRL)
95: superluminous supernova (SLSN-I)
991: microlens-binary
992: intermediate luminosity optical transient (ILOT)
993: calcium-rich transient (CaRT)
994: pair instability supernova (PISN)
995: microlens-string
```
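For decoding the integer `label` field, the list above can be transcribed into a lookup table (a convenience sketch, not part of the dataset itself):
```python
# Integer label -> class name, transcribed from the list above.
PLASTICC_CLASSES = {
    6: "microlens-single",
    15: "tidal disruption event (TDE)",
    16: "eclipsing binary (EB)",
    42: "type II supernova (SNII)",
    52: "peculiar type Ia supernova (SNIax)",
    53: "Mira variable",
    62: "type Ibc supernova (SNIbc)",
    64: "kilonova (KN)",
    65: "M-dwarf",
    67: "peculiar type Ia supernova (SNIa-91bg)",
    88: "active galactic nuclei (AGN)",
    90: "type Ia supernova (SNIa)",
    92: "RR-Lyrae (RRL)",
    95: "superluminous supernova (SLSN-I)",
    991: "microlens-binary",
    992: "intermediate luminosity optical transient (ILOT)",
    993: "calcium-rich transient (CaRT)",
    994: "pair instability supernova (PISN)",
    995: "microlens-string",
}

print(PLASTICC_CLASSES[90])  # "type Ia supernova (SNIa)"
```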
### Citation Information
```
@ARTICLE{2018arXiv181000001T,
author = {{The PLAsTiCC team} and {Allam}, Tarek, Jr. and {Bahmanyar}, Anita and {Biswas}, Rahul and {Dai}, Mi and {Galbany}, Llu{\'\i}s and {Hlo{\v{z}}ek}, Ren{\'e}e and {Ishida}, Emille E.~O. and {Jha}, Saurabh W. and {Jones}, David O. and {Kessler}, Richard and {Lochner}, Michelle and {Mahabal}, Ashish A. and {Malz}, Alex I. and {Mandel}, Kaisey S. and {Mart{\'\i}nez-Galarza}, Juan Rafael and {McEwen}, Jason D. and {Muthukrishna}, Daniel and {Narayan}, Gautham and {Peiris}, Hiranya and {Peters}, Christina M. and {Ponder}, Kara and {Setzer}, Christian N. and {The LSST Dark Energy Science Collaboration} and {LSST Transients}, The and {Variable Stars Science Collaboration}},
title = "{The Photometric LSST Astronomical Time-series Classification Challenge (PLAsTiCC): Data set}",
journal = {arXiv e-prints},
keywords = {Astrophysics - Instrumentation and Methods for Astrophysics, Astrophysics - Solar and Stellar Astrophysics},
year = 2018,
month = sep,
eid = {arXiv:1810.00001},
pages = {arXiv:1810.00001},
doi = {10.48550/arXiv.1810.00001},
archivePrefix = {arXiv},
eprint = {1810.00001},
primaryClass = {astro-ph.IM},
adsurl = {https://ui.adsabs.harvard.edu/abs/2018arXiv181000001T},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
``` |
open-llm-leaderboard/details_facebook__opt-iml-max-1.3b | ---
pretty_name: Evaluation run of facebook/opt-iml-max-1.3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [facebook/opt-iml-max-1.3b](https://huggingface.co/facebook/opt-iml-max-1.3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_facebook__opt-iml-max-1.3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T09:50:43.719660](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-iml-max-1.3b/blob/main/results_2023-10-18T09-50-43.719660.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3028523489932886,\n\
\ \"em_stderr\": 0.0047056271048806315,\n \"f1\": 0.3369934983221478,\n\
\ \"f1_stderr\": 0.004663613383395755,\n \"acc\": 0.30375849777371944,\n\
\ \"acc_stderr\": 0.007878524617348554\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3028523489932886,\n \"em_stderr\": 0.0047056271048806315,\n\
\ \"f1\": 0.3369934983221478,\n \"f1_stderr\": 0.004663613383395755\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480856\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6022099447513812,\n \"acc_stderr\": 0.013755743513749023\n\
\ }\n}\n```"
repo_url: https://huggingface.co/facebook/opt-iml-max-1.3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|arc:challenge|25_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T09_50_43.719660
path:
- '**/details_harness|drop|3_2023-10-18T09-50-43.719660.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T09-50-43.719660.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T09_50_43.719660
path:
- '**/details_harness|gsm8k|5_2023-10-18T09-50-43.719660.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T09-50-43.719660.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hellaswag|10_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_22T09_51_53.668877
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T09:51:53.668877.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-22T09:51:53.668877.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T09_50_43.719660
path:
- '**/details_harness|winogrande|5_2023-10-18T09-50-43.719660.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T09-50-43.719660.parquet'
- config_name: results
data_files:
- split: 2023_10_18T09_50_43.719660
path:
- results_2023-10-18T09-50-43.719660.parquet
- split: latest
path:
- results_2023-10-18T09-50-43.719660.parquet
---
# Dataset Card for Evaluation run of facebook/opt-iml-max-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/facebook/opt-iml-max-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [facebook/opt-iml-max-1.3b](https://huggingface.co/facebook/opt-iml-max-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_facebook__opt-iml-max-1.3b",
"harness_winogrande_5",
split="train")
```
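The aggregated metrics live in the "results" configuration (listed in this card's metadata); following the same pattern, they can be loaded with:
```python
from datasets import load_dataset

# "latest" always points at the newest run's aggregated results.
results = load_dataset(
    "open-llm-leaderboard/details_facebook__opt-iml-max-1.3b",
    "results",
    split="latest",
)
print(results[0])
```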
## Latest results
These are the [latest results from run 2023-10-18T09:50:43.719660](https://huggingface.co/datasets/open-llm-leaderboard/details_facebook__opt-iml-max-1.3b/blob/main/results_2023-10-18T09-50-43.719660.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3028523489932886,
"em_stderr": 0.0047056271048806315,
"f1": 0.3369934983221478,
"f1_stderr": 0.004663613383395755,
"acc": 0.30375849777371944,
"acc_stderr": 0.007878524617348554
},
"harness|drop|3": {
"em": 0.3028523489932886,
"em_stderr": 0.0047056271048806315,
"f1": 0.3369934983221478,
"f1_stderr": 0.004663613383395755
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480856
},
"harness|winogrande|5": {
"acc": 0.6022099447513812,
"acc_stderr": 0.013755743513749023
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
result-muse256-muse512-wuerst-sdv15/0b3e4624 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 224
num_examples: 10
download_size: 1395
dataset_size: 224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "0b3e4624"
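Per the `dataset_info` above (a single `train` split with a string `result` column and an int64 `id` column), the data can be loaded with a minimal sketch like:
```python
from datasets import load_dataset

# 10 examples, two columns: result (string) and id (int64).
ds = load_dataset("result-muse256-muse512-wuerst-sdv15/0b3e4624", split="train")
print(ds.features)
print(ds[0])
```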
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c50da3-1597456333 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: facebook/opt-6.7b
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-6.7b
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
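To inspect the stored predictions programmatically, the repo can be loaded like any other dataset. This is a sketch; the split name is an assumption based on the `dataset_split: test` entry in the metadata above.
```python
from datasets import load_dataset

# Split name assumed from the eval_info metadata above.
preds = load_dataset(
    "autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-c50da3-1597456333",
    split="test",
)
print(preds)
```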
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct | ---
pretty_name: Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elyza/ELYZA-japanese-Llama-2-7b-fast-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T13:15:23.023152](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct/blob/main/results_2023-09-18T13-15-23.023152.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335574,\n \"f1\": 0.05842596476510087,\n\
\ \"f1_stderr\": 0.0014351374704884914,\n \"acc\": 0.3893953528449777,\n\
\ \"acc_stderr\": 0.009682077684152727\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335574,\n\
\ \"f1\": 0.05842596476510087,\n \"f1_stderr\": 0.0014351374704884914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \
\ \"acc_stderr\": 0.00668876258153273\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772724\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T13_15_23.023152
path:
- '**/details_harness|drop|3_2023-09-18T13-15-23.023152.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T13-15-23.023152.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T13_15_23.023152
path:
- '**/details_harness|gsm8k|5_2023-09-18T13-15-23.023152.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T13-15-23.023152.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T13_15_23.023152
path:
- '**/details_harness|winogrande|5_2023-09-18T13-15-23.023152.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T13-15-23.023152.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- results_2023-08-31T10:31:06.173852.parquet
- split: 2023_09_18T13_15_23.023152
path:
- results_2023-09-18T13-15-23.023152.parquet
- split: latest
path:
- results_2023-09-18T13-15-23.023152.parquet
---
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-7b-fast-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct",
"harness_winogrande_5",
split="train")
```
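Each evaluation task also has its own configuration with the timestamped splits listed in the metadata above, so per-task details can be loaded directly. A minimal sketch (the config and split names below are taken from this card's config list):
```python
from datasets import load_dataset

# Load the "latest" split of one MMLU subtask; "latest" always points
# to the most recent run for that task.
details = load_dataset(
    "open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details[0])
```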
## Latest results
These are the [latest results from run 2023-09-18T13:15:23.023152](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct/blob/main/results_2023-09-18T13-15-23.023152.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335574,
"f1": 0.05842596476510087,
"f1_stderr": 0.0014351374704884914,
"acc": 0.3893953528449777,
"acc_stderr": 0.009682077684152727
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335574,
"f1": 0.05842596476510087,
"f1_stderr": 0.0014351374704884914
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.00668876258153273
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772724
}
}
```
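The aggregated numbers above live in the "results" configuration; a sketch for pulling them programmatically (config and split names come from this card's metadata):
```python
from datasets import load_dataset

# The "results" config stores the aggregated results files;
# its "latest" split points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct",
    "results",
    split="latest",
)
print(results)
```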
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EleutherAI/quirky_sciq_bob_hard | ---
dataset_info:
features:
- name: id
dtype: string
- name: choices
sequence: string
- name: label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
splits:
- name: train
num_bytes: 746877.1236888566
num_examples: 1204
- name: validation
num_bytes: 132886.768
num_examples: 224
- name: test
num_bytes: 157867.276
num_examples: 266
download_size: 314191
dataset_size: 1037631.1676888566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_MayaPH__FinOPT-Franklin | ---
pretty_name: Evaluation run of MayaPH/FinOPT-Franklin
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MayaPH/FinOPT-Franklin](https://huggingface.co/MayaPH/FinOPT-Franklin) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__FinOPT-Franklin\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T03:49:57.107802](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Franklin/blob/main/results_2023-10-18T03-49-57.107802.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the \"results\" config and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0005243288590604027,\n\
\ \"em_stderr\": 0.00023443780464837331,\n \"f1\": 0.0010171979865771813,\n\
\ \"f1_stderr\": 0.0002699153689755448,\n \"acc\": 0.2525651144435675,\n\
\ \"acc_stderr\": 0.007025872980895256\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0005243288590604027,\n \"em_stderr\": 0.00023443780464837331,\n\
\ \"f1\": 0.0010171979865771813,\n \"f1_stderr\": 0.0002699153689755448\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.505130228887135,\n\
\ \"acc_stderr\": 0.014051745961790513\n }\n}\n```"
repo_url: https://huggingface.co/MayaPH/FinOPT-Franklin
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|arc:challenge|25_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T03_49_57.107802
path:
- '**/details_harness|drop|3_2023-10-18T03-49-57.107802.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T03-49-57.107802.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T03_49_57.107802
path:
- '**/details_harness|gsm8k|5_2023-10-18T03-49-57.107802.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T03-49-57.107802.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hellaswag|10_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T12:10:37.381661.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T12:10:37.381661.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T03_49_57.107802
path:
- '**/details_harness|winogrande|5_2023-10-18T03-49-57.107802.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T03-49-57.107802.parquet'
- config_name: results
data_files:
- split: 2023_07_19T12_10_37.381661
path:
- results_2023-07-19T12:10:37.381661.parquet
- split: 2023_10_18T03_49_57.107802
path:
- results_2023-10-18T03-49-57.107802.parquet
- split: latest
path:
- results_2023-10-18T03-49-57.107802.parquet
---
# Dataset Card for Evaluation run of MayaPH/FinOPT-Franklin
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/FinOPT-Franklin
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MayaPH/FinOPT-Franklin](https://huggingface.co/MayaPH/FinOPT-Franklin) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__FinOPT-Franklin",
"harness_winogrande_5",
split="train")
```
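With 64 configurations in the repository, it can help to enumerate them before loading; a sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names, load_dataset

# List every evaluation config available in this details repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_MayaPH__FinOPT-Franklin")
print(len(configs), configs[:5])

# Then load one of them, e.g. the GSM8K details from the latest run.
gsm8k = load_dataset(
    "open-llm-leaderboard/details_MayaPH__FinOPT-Franklin",
    "harness_gsm8k_5",
    split="latest",
)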
## Latest results
These are the [latest results from run 2023-10-18T03:49:57.107802](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__FinOPT-Franklin/blob/main/results_2023-10-18T03-49-57.107802.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464837331,
"f1": 0.0010171979865771813,
"f1_stderr": 0.0002699153689755448,
"acc": 0.2525651144435675,
"acc_stderr": 0.007025872980895256
},
"harness|drop|3": {
"em": 0.0005243288590604027,
"em_stderr": 0.00023443780464837331,
"f1": 0.0010171979865771813,
"f1_stderr": 0.0002699153689755448
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.014051745961790513
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dgallitelli/multilingual-wealth-alpaca | ---
license: mit
task_categories:
- text-generation
language:
- en
- it
- fr
- es
---
# Multilingual Wealth Alpaca Dataset

This is a work derivative of the [gbharti/wealth-alpaca_lora dataset](https://huggingface.co/datasets/gbharti/wealth-alpaca_lora). The original dataset is a combination of Stanford's Alpaca (https://github.com/tatsu-lab/stanford_alpaca) and FiQA (https://sites.google.com/view/fiqa/), with another 1.3k pairs custom-generated using GPT-3.5. This version is a cleaned-up version, which also has:
- multilingual support (en, it, fr, es, de)
- CSV and JSON files
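
A minimal loading sketch with the `datasets` library (the default configuration and split layout are assumptions, since the card does not specify how the CSV/JSON files are exposed):
```python
from datasets import load_dataset

# Sketch only: assumes the repository resolves under its default configuration.
ds = load_dataset("dgallitelli/multilingual-wealth-alpaca")
print(ds)
```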
|
SKT27182/NER | ---
license: mit
dataset_info:
features:
- name: ID
dtype: string
- name: tags
dtype: string
- name: text
dtype: string
- name: dataset_num
dtype: int64
- name: tokens
sequence: string
- name: ner_tags
sequence: float64
splits:
- name: train
num_bytes: 8709521
num_examples: 19709
download_size: 2626890
dataset_size: 8709521
---
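The features above declare `tokens` and `ner_tags` as parallel sequences. A minimal sketch for pairing them (assuming the default configuration and the `train` split declared above):
```python
from datasets import load_dataset

# Sketch only: pair each token with its (float-encoded) NER tag.
ds = load_dataset("SKT27182/NER", split="train")
example = ds[0]
for token, tag in zip(example["tokens"], example["ner_tags"]):
    print(token, tag)
```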
|
niryuu/sni-each-converted | ---
language:
- en
license: apache-2.0
---
Converted Super-NaturalInstructions to jsonl
https://github.com/allenai/natural-instructions |
vlsp-2023-vllm/exams_lichsu | ---
dataset_info:
features:
- name: question
dtype: string
- name: id
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: answerKey
dtype: string
- name: metadata
struct:
- name: grade
dtype: string
- name: subject
dtype: string
splits:
- name: test
num_bytes: 2291100
num_examples: 5350
download_size: 1044296
dataset_size: 2291100
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "exams_lichsu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
confit/esc50 | ---
dataset_info:
- config_name: fold1
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: sound
dtype: string
- name: label
dtype:
class_label:
names:
'0': airplane
'1': breathing
'2': brushing_teeth
'3': can_opening
'4': car_horn
'5': cat
'6': chainsaw
'7': chirping_birds
'8': church_bells
'9': clapping
'10': clock_alarm
'11': clock_tick
'12': coughing
'13': cow
'14': crackling_fire
'15': crickets
'16': crow
'17': crying_baby
'18': dog
'19': door_wood_creaks
'20': door_wood_knock
'21': drinking_sipping
'22': engine
'23': fireworks
'24': footsteps
'25': frog
'26': glass_breaking
'27': hand_saw
'28': helicopter
'29': hen
'30': insects
'31': keyboard_typing
'32': laughing
'33': mouse_click
'34': pig
'35': pouring_water
'36': rain
'37': rooster
'38': sea_waves
'39': sheep
'40': siren
'41': sneezing
'42': snoring
'43': thunderstorm
'44': toilet_flush
'45': train
'46': vacuum_cleaner
'47': washing_machine
'48': water_drops
'49': wind
splits:
- name: train
num_bytes: 705710450.2
num_examples: 1600
- name: test
num_bytes: 176427616
num_examples: 400
download_size: 773383933
dataset_size: 882138066.2
- config_name: fold2
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: sound
dtype: string
- name: label
dtype:
class_label:
names:
'0': airplane
'1': breathing
'2': brushing_teeth
'3': can_opening
'4': car_horn
'5': cat
'6': chainsaw
'7': chirping_birds
'8': church_bells
'9': clapping
'10': clock_alarm
'11': clock_tick
'12': coughing
'13': cow
'14': crackling_fire
'15': crickets
'16': crow
'17': crying_baby
'18': dog
'19': door_wood_creaks
'20': door_wood_knock
'21': drinking_sipping
'22': engine
'23': fireworks
'24': footsteps
'25': frog
'26': glass_breaking
'27': hand_saw
'28': helicopter
'29': hen
'30': insects
'31': keyboard_typing
'32': laughing
'33': mouse_click
'34': pig
'35': pouring_water
'36': rain
'37': rooster
'38': sea_waves
'39': sheep
'40': siren
'41': sneezing
'42': snoring
'43': thunderstorm
'44': toilet_flush
'45': train
'46': vacuum_cleaner
'47': washing_machine
'48': water_drops
'49': wind
splits:
- name: train
num_bytes: 705710467.8
num_examples: 1600
- name: test
num_bytes: 176427616
num_examples: 400
download_size: 773374873
dataset_size: 882138083.8
- config_name: fold3
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: sound
dtype: string
- name: label
dtype:
class_label:
names:
'0': airplane
'1': breathing
'2': brushing_teeth
'3': can_opening
'4': car_horn
'5': cat
'6': chainsaw
'7': chirping_birds
'8': church_bells
'9': clapping
'10': clock_alarm
'11': clock_tick
'12': coughing
'13': cow
'14': crackling_fire
'15': crickets
'16': crow
'17': crying_baby
'18': dog
'19': door_wood_creaks
'20': door_wood_knock
'21': drinking_sipping
'22': engine
'23': fireworks
'24': footsteps
'25': frog
'26': glass_breaking
'27': hand_saw
'28': helicopter
'29': hen
'30': insects
'31': keyboard_typing
'32': laughing
'33': mouse_click
'34': pig
'35': pouring_water
'36': rain
'37': rooster
'38': sea_waves
'39': sheep
'40': siren
'41': sneezing
'42': snoring
'43': thunderstorm
'44': toilet_flush
'45': train
'46': vacuum_cleaner
'47': washing_machine
'48': water_drops
'49': wind
splits:
- name: train
num_bytes: 705710462
num_examples: 1600
- name: test
num_bytes: 176427616
num_examples: 400
download_size: 773552360
dataset_size: 882138078
- config_name: fold4
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: sound
dtype: string
- name: label
dtype:
class_label:
names:
'0': airplane
'1': breathing
'2': brushing_teeth
'3': can_opening
'4': car_horn
'5': cat
'6': chainsaw
'7': chirping_birds
'8': church_bells
'9': clapping
'10': clock_alarm
'11': clock_tick
'12': coughing
'13': cow
'14': crackling_fire
'15': crickets
'16': crow
'17': crying_baby
'18': dog
'19': door_wood_creaks
'20': door_wood_knock
'21': drinking_sipping
'22': engine
'23': fireworks
'24': footsteps
'25': frog
'26': glass_breaking
'27': hand_saw
'28': helicopter
'29': hen
'30': insects
'31': keyboard_typing
'32': laughing
'33': mouse_click
'34': pig
'35': pouring_water
'36': rain
'37': rooster
'38': sea_waves
'39': sheep
'40': siren
'41': sneezing
'42': snoring
'43': thunderstorm
'44': toilet_flush
'45': train
'46': vacuum_cleaner
'47': washing_machine
'48': water_drops
'49': wind
splits:
- name: train
num_bytes: 705710450
num_examples: 1600
- name: test
num_bytes: 176427616
num_examples: 400
download_size: 773258954
dataset_size: 882138066
- config_name: fold5
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: sound
dtype: string
- name: label
dtype:
class_label:
names:
'0': airplane
'1': breathing
'2': brushing_teeth
'3': can_opening
'4': car_horn
'5': cat
'6': chainsaw
'7': chirping_birds
'8': church_bells
'9': clapping
'10': clock_alarm
'11': clock_tick
'12': coughing
'13': cow
'14': crackling_fire
'15': crickets
'16': crow
'17': crying_baby
'18': dog
'19': door_wood_creaks
'20': door_wood_knock
'21': drinking_sipping
'22': engine
'23': fireworks
'24': footsteps
'25': frog
'26': glass_breaking
'27': hand_saw
'28': helicopter
'29': hen
'30': insects
'31': keyboard_typing
'32': laughing
'33': mouse_click
'34': pig
'35': pouring_water
'36': rain
'37': rooster
'38': sea_waves
'39': sheep
'40': siren
'41': sneezing
'42': snoring
'43': thunderstorm
'44': toilet_flush
'45': train
'46': vacuum_cleaner
'47': washing_machine
'48': water_drops
'49': wind
splits:
- name: train
num_bytes: 705710464.4
num_examples: 1600
- name: test
num_bytes: 176427616
num_examples: 400
download_size: 773395386
dataset_size: 882138080.4
configs:
- config_name: fold1
data_files:
- split: train
path: fold1/train-*
- split: test
path: fold1/test-*
- config_name: fold2
data_files:
- split: train
path: fold2/train-*
- split: test
path: fold2/test-*
- config_name: fold3
data_files:
- split: train
path: fold3/train-*
- split: test
path: fold3/test-*
- config_name: fold4
data_files:
- split: train
path: fold4/train-*
- split: test
path: fold4/test-*
- config_name: fold5
data_files:
- split: train
path: fold5/train-*
- split: test
path: fold5/test-*
task_categories:
- audio-classification
tags:
- audio
- multiclass
--- |
JM-Lee/Understanding_full_alpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
- name: generated
dtype: string
- name: understanding
dtype: string
splits:
- name: train
num_bytes: 2164
num_examples: 1
download_size: 16229
dataset_size: 2164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vuducanh/b3-userstudy-data | ---
license: mit
---
Dataset sources:
- `shark_dataset_location`: https://www.kaggle.com/datasets/mysarahmadbhat/shark-attacks
- `nba_dataset_location`: https://zenodo.org/record/6419727
- `fec_dataset_location`: https://github.com/wesm/pydata-book/blob/2nd-edition/datasets/fec/P00000001-ALL.csv
|
CyberHarem/nagara_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nagara/長良/长良 (Azur Lane)
This is the dataset of nagara/長良/长良 (Azur Lane), containing 108 images and their tags.
The core tags of this character are `breasts, hair_ornament, hairclip, horns, long_hair, twintails, hair_between_eyes, large_breasts, brown_eyes, black_hair, bow, bangs, ribbon, red_bow, low_twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 108 | 131.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagara_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 108 | 79.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagara_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 273 | 174.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagara_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 108 | 119.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagara_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 273 | 239.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagara_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nagara_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 69 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, cardigan, white_shirt, long_sleeves, pleated_skirt, bell, open_mouth, white_background, simple_background, school_uniform, collared_shirt, black_skirt, :d, brown_hair, red_bowtie, button_gap, hair_ribbon |
| 1 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, hair_bow, nipples, penis, smile, pov, solo_focus, heart, looking_at_viewer, mosaic_censoring, open_mouth, paizuri, sweat, breasts_squeezed_together, male_pubic_hair, sex, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | cardigan | white_shirt | long_sleeves | pleated_skirt | bell | open_mouth | white_background | simple_background | school_uniform | collared_shirt | black_skirt | :d | brown_hair | red_bowtie | button_gap | hair_ribbon | 1boy | hetero | hair_bow | nipples | penis | smile | pov | solo_focus | heart | mosaic_censoring | paizuri | sweat | breasts_squeezed_together | male_pubic_hair | sex | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-----------|:--------------|:---------------|:----------------|:-------|:-------------|:-------------------|:--------------------|:-----------------|:-----------------|:--------------|:-----|:-------------|:-------------|:-------------|:--------------|:-------|:---------|:-----------|:----------|:--------|:--------|:------|:-------------|:--------|:-------------------|:----------|:--------|:----------------------------|:------------------|:------|:----------|
| 0 | 69 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
iocuydi/amharic-alpaca | ---
license: apache-2.0
---
More details: https://arxiv.org/abs/2403.06354 |
Suyogyart/np20ng | ---
annotations_creators:
- other
language:
- ne
language_creators:
- machine-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: np20ng
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- nepali-newsgroups
- nepali-20-newsgroups
- np20ng
- nepali text classification
- natural language processing
- news
- headline
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for [np20ng]
## Table of Contents
- [Dataset Card for [np20ng]](#dataset-card-for-np20ng)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** To be updated
- **Repository:** To be updated
- **Paper:** Submitted for review
- **Leaderboard:** To be updated
- **Point of Contact:** To be updated
### Dataset Summary
This is a multi-class Nepali text classification dataset. Texts are the news documents and labels are the news categories. It consists of over 200,000 documents categorized into 20 different Nepali news groups. News documents from 10 different news sources are compiled into this dataset. Labeling is done using the category-specific news sections of the respective news portals.
### Supported Tasks and Leaderboards
- Multi-class text classification from news document
- Multi-class text classification from news headings
- News heading generation from news document
### Languages
- Nepali
## Dataset Structure
### Data Instances
The dataset consists of over 200,000 Nepali news documents categorized into 20 different news categories.
### Data Fields
- **category:** News category
- **content:** News document (main text)
- **headline:** News headline
- **source:** News source from which the news was taken
### Data Splits
The dataset is provided as a single whole and is not split.
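A minimal loading sketch (the `train` split name is an assumption: the card states the dataset is unsplit, and `datasets` conventionally exposes such data as a single `train` split):
```python
from datasets import load_dataset

# Sketch only: the "train" split name is assumed, as noted above.
ds = load_dataset("Suyogyart/np20ng", split="train")
row = ds[0]
print(row["category"], row["source"])
print(row["headline"])
```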
## Dataset Creation
### Curation Rationale
To create a large-scale Nepali text classification dataset and release it to the public for further research and development.
### Source Data
#### Initial Data Collection and Normalization
Data are scraped from popular Nepali news portals such as Onlinekhabar, Nepalkhabar, Ekantipur, Ratopati, Gorkhapatra, Nepalipatra, Educationpati, Crimenews, etc.
#### Who are the source language producers?
News portals
### Annotations
#### Annotation process
Category labeling of news documents is done automatically, as the documents are scraped from category-specific URLs of each news source.
#### Who are the annotators?
News portals
### Personal and Sensitive Information
This dataset does not contain any personal or sensitive information. However, the news content may carry some bias or irregular information that could be sensitive; such content is not attributable to the author of the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
No issues.
### Discussion of Biases
Categories depend on how the news portals have categorized their articles, which may introduce some bias between sources.
### Other Known Limitations
News summaries are not included
## Additional Information
### Dataset Curators
The author of this dataset ([@Suyogyart](https://github.com/Suyogyart)).
### Licensing Information
Apache-2.0
### Citation Information
To be updated later (paper submission in progress)
### Contributions
Thanks to [@Suyogyart](https://github.com/Suyogyart) for adding this dataset.
|
ALTACambridge/KUPA-KEYS | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
language:
- en
pretty_name: KUPA-KEYS
size_categories:
- 1K<n<10K
---
This repository hosts the dataset collected during the project, 'Deep Learning for Language Assessment', as detailed in the paper "Logging Keystrokes in Writing by English Learners", to appear in the proceedings of LREC-COLING 2024.
The dataset is named **KUPA-KEYS** (King's College London & Université Paris Cité Keys). It contains texts written by 1,006 participants in our crowdsourcing study, recruited on Prolific. Task 1 involved a text-copy task; Task 2 involved essay writing in response to a 'Just for Fun' prompt from [Write & Improve](https://writeandimprove.com/), used with permission. Keystroke data for these texts are included in the dataset, as well as metadata and CEFR level grades for the free-text essays. Further details about the data collection process, annotation and analysis may be found in our LREC-COLING paper.
_Georgios Velentzas, Andrew Caines, Rita Borgo, Erin Pacquetet, Clive Hamilton, Taylor Arnold, Diane Nicholls, Paula Buttery, Thomas Gaillat, Nicolas Ballier and Helen Yannakoudakis_ |
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.1 | ---
pretty_name: Evaluation run of jondurbin/airoboros-13b-gpt4-1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-13b-gpt4-1.1](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T21:49:14.106154](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.1/blob/main/results_2023-10-22T21-49-14.106154.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.037017617449664426,\n\
\ \"em_stderr\": 0.0019335395228219918,\n \"f1\": 0.09976300335570489,\n\
\ \"f1_stderr\": 0.0023092531505962102,\n \"acc\": 0.4197877778063671,\n\
\ \"acc_stderr\": 0.009797345526945866\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.037017617449664426,\n \"em_stderr\": 0.0019335395228219918,\n\
\ \"f1\": 0.09976300335570489,\n \"f1_stderr\": 0.0023092531505962102\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08188021228203184,\n \
\ \"acc_stderr\": 0.007552338527716947\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174785\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T21_49_14.106154
path:
- '**/details_harness|drop|3_2023-10-22T21-49-14.106154.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T21-49-14.106154.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T21_49_14.106154
path:
- '**/details_harness|gsm8k|5_2023-10-22T21-49-14.106154.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T21-49-14.106154.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T21_49_14.106154
path:
- '**/details_harness|winogrande|5_2023-10-22T21-49-14.106154.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T21-49-14.106154.parquet'
- config_name: results
data_files:
- split: 2023_10_22T21_49_14.106154
path:
- results_2023-10-22T21-49-14.106154.parquet
- split: latest
path:
- results_2023-10-22T21-49-14.106154.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.1](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.1",
"harness_winogrande_5",
split="train")
```
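Since splits are named by run timestamp, a specific run can also be loaded directly. A sketch using the timestamped split listed in the configuration metadata above (here for the drop-free gsm8k details; the split string is taken verbatim from the configs):
```python
from datasets import load_dataset

# Sketch only: load the gsm8k details for the run listed in the configs;
# the "latest" split would resolve to the same data for this single run.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.1",
                    "harness_gsm8k_5",
                    split="2023_10_22T21_49_14.106154")
print(len(data))
```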
## Latest results
These are the [latest results from run 2023-10-22T21:49:14.106154](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.1/blob/main/results_2023-10-22T21-49-14.106154.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.037017617449664426,
"em_stderr": 0.0019335395228219918,
"f1": 0.09976300335570489,
"f1_stderr": 0.0023092531505962102,
"acc": 0.4197877778063671,
"acc_stderr": 0.009797345526945866
},
"harness|drop|3": {
"em": 0.037017617449664426,
"em_stderr": 0.0019335395228219918,
"f1": 0.09976300335570489,
"f1_stderr": 0.0023092531505962102
},
"harness|gsm8k|5": {
"acc": 0.08188021228203184,
"acc_stderr": 0.007552338527716947
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174785
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
erkanxyzalaca/turkishKuran | ---
dataset_info:
features:
- name: Ayet
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 255726.9
num_examples: 738
- name: validation
num_bytes: 28414.1
num_examples: 82
download_size: 0
dataset_size: 284141.0
---
# Dataset Card for "turkishKuran"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
result-kand2-sdxl-wuerst-karlo/083be228 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1349
dataset_size: 176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "083be228"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
naksidil/turkishReviews-ds-mini | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 896651
dataset_size: 1392332.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b | ---
pretty_name: Evaluation run of zarakiquemparte/zaraxe-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zaraxe-l2-7b](https://huggingface.co/zarakiquemparte/zaraxe-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T11:25:34.979979](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b/blob/main/results_2023-09-23T11-25-34.979979.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19169463087248323,\n\
\ \"em_stderr\": 0.004031181549439802,\n \"f1\": 0.27804110738255156,\n\
\ \"f1_stderr\": 0.0041099263816090316,\n \"acc\": 0.4053108206032529,\n\
\ \"acc_stderr\": 0.00984887759467774\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.19169463087248323,\n \"em_stderr\": 0.004031181549439802,\n\
\ \"f1\": 0.27804110738255156,\n \"f1_stderr\": 0.0041099263816090316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.0072912057231626195\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zaraxe-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|arc:challenge|25_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- '**/details_harness|drop|3_2023-09-23T11-25-34.979979.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T11-25-34.979979.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- '**/details_harness|gsm8k|5_2023-09-23T11-25-34.979979.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T11-25-34.979979.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hellaswag|10_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_23T21_46_04.335707
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T21:46:04.335707.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-23T21:46:04.335707.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- '**/details_harness|winogrande|5_2023-09-23T11-25-34.979979.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T11-25-34.979979.parquet'
- config_name: results
data_files:
- split: 2023_09_23T11_25_34.979979
path:
- results_2023-09-23T11-25-34.979979.parquet
- split: latest
path:
- results_2023-09-23T11-25-34.979979.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zaraxe-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zaraxe-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zaraxe-l2-7b](https://huggingface.co/zarakiquemparte/zaraxe-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b",
"harness_winogrande_5",
    split="latest")
```
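Since "latest" always mirrors the newest run, the details of one specific run can also be loaded through its timestamped split; a minimal sketch, using the split name declared in the configuration list above:
```python
from datasets import load_dataset

# Load the details of one specific run via its timestamped split name
data = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b",
    "harness_winogrande_5",
    split="2023_09_23T11_25_34.979979",
)
```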
## Latest results
These are the [latest results from run 2023-09-23T11:25:34.979979](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b/blob/main/results_2023-09-23T11-25-34.979979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results under its timestamped and "latest" splits):
```python
{
"all": {
"em": 0.19169463087248323,
"em_stderr": 0.004031181549439802,
"f1": 0.27804110738255156,
"f1_stderr": 0.0041099263816090316,
"acc": 0.4053108206032529,
"acc_stderr": 0.00984887759467774
},
"harness|drop|3": {
"em": 0.19169463087248323,
"em_stderr": 0.004031181549439802,
"f1": 0.27804110738255156,
"f1_stderr": 0.0041099263816090316
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.0072912057231626195
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
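The same aggregated numbers are also stored in the "results" configuration; a minimal sketch for reading them programmatically (configuration and split names taken from the YAML above):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of each run
results = load_dataset(
    "open-llm-leaderboard/details_zarakiquemparte__zaraxe-l2-7b",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics
```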
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_stsb_indefinite_for_definite_articles | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 140791
num_examples: 791
- name: test
num_bytes: 87585
num_examples: 529
- name: train
num_bytes: 375034
num_examples: 2073
download_size: 378363
dataset_size: 603410
---
# Dataset Card for "MULTI_VALUE_stsb_indefinite_for_definite_articles"
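A minimal loading sketch (plain `datasets` usage, not from the original card), based on the splits and fields declared in the metadata above:
```python
from datasets import load_dataset

# Splits declared in the metadata: dev (791 rows), test (529), train (2073)
ds = load_dataset("liuyanchen1015/MULTI_VALUE_stsb_indefinite_for_definite_articles")
print(ds["train"][0])  # fields: sentence1, sentence2, score, idx, value_score
```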
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AGI-0__Magistral-7B-v0.1 | ---
pretty_name: Evaluation run of AGI-0/Magistral-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AGI-0/Magistral-7B-v0.1](https://huggingface.co/AGI-0/Magistral-7B-v0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AGI-0__Magistral-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-02T00:33:40.342861](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-0__Magistral-7B-v0.1/blob/main/results_2024-03-02T00-33-40.342861.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6468927390297269,\n\
\ \"acc_stderr\": 0.03225015812358322,\n \"acc_norm\": 0.6471943152304395,\n\
\ \"acc_norm_stderr\": 0.03291969038038486,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6139311896012898,\n\
\ \"mc2_stderr\": 0.015078485729905217\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.0140841331181043,\n\
\ \"acc_norm\": 0.6715017064846417,\n \"acc_norm_stderr\": 0.013724978465537304\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6728739294961164,\n\
\ \"acc_stderr\": 0.004682048906622317,\n \"acc_norm\": 0.862975502887871,\n\
\ \"acc_norm_stderr\": 0.003431704298641853\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.0295973297309781,\n \
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.0295973297309781\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967273,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967273\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6139311896012898,\n\
\ \"mc2_stderr\": 0.015078485729905217\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237435\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6694465504169825,\n \
\ \"acc_stderr\": 0.012957496367085026\n }\n}\n```"
repo_url: https://huggingface.co/AGI-0/Magistral-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|arc:challenge|25_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|gsm8k|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hellaswag|10_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T00-33-40.342861.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T00-33-40.342861.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- '**/details_harness|winogrande|5_2024-03-02T00-33-40.342861.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-02T00-33-40.342861.parquet'
- config_name: results
data_files:
- split: 2024_03_02T00_33_40.342861
path:
- results_2024-03-02T00-33-40.342861.parquet
- split: latest
path:
- results_2024-03-02T00-33-40.342861.parquet
---
# Dataset Card for Evaluation run of AGI-0/Magistral-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AGI-0/Magistral-7B-v0.1](https://huggingface.co/AGI-0/Magistral-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AGI-0__Magistral-7B-v0.1",
"harness_winogrande_5",
    split="latest")
```
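To enumerate all 63 per-task configurations without scanning the YAML by hand, the config names can be listed directly; a minimal sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

# List every per-task configuration available in this evaluation repository
configs = get_dataset_config_names("open-llm-leaderboard/details_AGI-0__Magistral-7B-v0.1")
print(len(configs), configs[:5])
```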
## Latest results
These are the [latest results from run 2024-03-02T00:33:40.342861](https://huggingface.co/datasets/open-llm-leaderboard/details_AGI-0__Magistral-7B-v0.1/blob/main/results_2024-03-02T00-33-40.342861.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each eval's results under its timestamped and "latest" splits):
```python
{
"all": {
"acc": 0.6468927390297269,
"acc_stderr": 0.03225015812358322,
"acc_norm": 0.6471943152304395,
"acc_norm_stderr": 0.03291969038038486,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6139311896012898,
"mc2_stderr": 0.015078485729905217
},
"harness|arc:challenge|25": {
"acc": 0.6331058020477816,
"acc_stderr": 0.0140841331181043,
"acc_norm": 0.6715017064846417,
"acc_norm_stderr": 0.013724978465537304
},
"harness|hellaswag|10": {
"acc": 0.6728739294961164,
"acc_stderr": 0.004682048906622317,
"acc_norm": 0.862975502887871,
"acc_norm_stderr": 0.003431704298641853
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.0295973297309781,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.0295973297309781
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967273,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967273
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6139311896012898,
"mc2_stderr": 0.015078485729905217
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237435
},
"harness|gsm8k|5": {
"acc": 0.6694465504169825,
"acc_stderr": 0.012957496367085026
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-jeffdshen__redefine_math2_8shot-jeffdshen__redefine_mat-af4c71-1853163409 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/redefine_math2_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-1.3b_eval
metrics: []
dataset_name: jeffdshen/redefine_math2_8shot
dataset_config: jeffdshen--redefine_math2_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-1.3b_eval
* Dataset: jeffdshen/redefine_math2_8shot
* Config: jeffdshen--redefine_math2_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
hmao/new_vt_apis | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: args_dicts
list:
- name: default
dtype: string
- name: description
dtype: string
- name: name
dtype: string
- name: required
dtype: bool
- name: type
dtype: string
- name: api_type
dtype: string
- name: description
dtype: string
- name: name
dtype: string
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 20764
num_examples: 29
download_size: 14860
dataset_size: 20764
---
# Dataset Card for "new_vt_apis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-44500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 2111582021
num_examples: 500
download_size: 462600129
dataset_size: 2111582021
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EMBO/sd-nlp-non-tokenized | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets: []
task_categories:
- token-classification
- text-classification
task_ids:
- multi-class-classification
- named-entity-recognition
- parsing
---
# Dataset Card for sd-nlp
## Table of Contents
- [Dataset Card for EMBO/sd-nlp-non-tokenized](#dataset-card-for-sd-nlp)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://sourcedata.embo.org
- **Repository:** https://github.com/source-data/soda-roberta
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** thomas.lemberger@embo.org, jorge.abreu@embo.org
### Dataset Summary
This dataset is based on the content of the SourceData (https://sourcedata.embo.org) database, which contains manually annotated figure legends written in English and extracted from scientific papers in the domain of cell and molecular biology (Liechti et al, Nature Methods, 2017, https://doi.org/10.1038/nmeth.4471).
Unlike the dataset [`sd-nlp`](https://huggingface.co/datasets/EMBO/sd-nlp), which is pre-tokenized with the `roberta-base` tokenizer, this dataset is not pre-tokenized but simply split into words. Users can therefore use it to fine-tune other models.
Additional details at https://github.com/source-data/soda-roberta
### Supported Tasks and Leaderboards
Tags are provided as [IOB2-style tags](https://en.wikipedia.org/wiki/Inside%E2%80%93outside%E2%80%93beginning_(tagging)).
`PANELIZATION`: figure captions (or figure legends) are usually composed of segments that each refer to one of several 'panels' of the full figure. Panels tend to represent results obtained with a coherent method and depict data points that can be meaningfully compared to each other. `PANELIZATION` provides the start (`B-PANEL_START`) of these segments and allows training for recognition of the boundary between consecutive panel legends.
`NER`: biological and chemical entities are labeled. Specifically the following entities are tagged:
- `SMALL_MOLECULE`: small molecules
- `GENEPROD`: gene products (genes and proteins)
- `SUBCELLULAR`: subcellular components
- `CELL`: cell types and cell lines.
- `TISSUE`: tissues and organs
- `ORGANISM`: species
- `DISEASE`: diseases (see limitations)
- `EXP_ASSAY`: experimental assays
`ROLES`: the role of entities with regard to the causal hypotheses tested in the reported results. The tags are:
- `CONTROLLED_VAR`: entities that are associated with experimental variables and that are subjected to controlled and targeted perturbations.
- `MEASURED_VAR`: entities that are associated with the measured variables and that are the object of the measurements.
`BORING`: entities are marked with the tag `BORING` when they are of mostly descriptive value and not directly associated with causal hypotheses ('boring' is not an ideal choice of word, but it is short...). Typically, these entities are so-called 'reporter' geneproducts, entities used as a common baseline across samples, or entities that specify the context of the experiment (cellular system, species, etc...).
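As a concrete illustration of the tagging scheme, the following helper (a sketch, not part of the SourceData tooling) converts a list of words and their IOB2 tags into `(entity_type, text)` spans:
```python
def iob2_spans(words, tags):
    """Collect (entity_type, text) spans from parallel word/IOB2-tag lists."""
    spans, current_words, current_type = [], [], None
    for word, tag in zip(words, tags):
        if tag.startswith("B-"):
            # a new entity begins; close any entity that is still open
            if current_type is not None:
                spans.append((current_type, " ".join(current_words)))
            current_type, current_words = tag[2:], [word]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # the current entity continues
            current_words.append(word)
        else:
            # "O" (or an inconsistent I- tag) closes any open entity
            if current_type is not None:
                spans.append((current_type, " ".join(current_words)))
            current_type, current_words = None, []
    if current_type is not None:
        spans.append((current_type, " ".join(current_words)))
    return spans

print(iob2_spans(
    ["Cisplatin", "dose", "response", "curves", "of", "SOX9"],
    ["B-SMALL_MOLECULE", "O", "O", "O", "O", "B-GENEPROD"],
))
# [('SMALL_MOLECULE', 'Cisplatin'), ('GENEPROD', 'SOX9')]
```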
### Languages
The text in the dataset is English.
## Dataset Structure
### Data Instances
```json
{
"words": [
".", "Figure", "6", "(", "A", ")", "Cisplatin", "dose", "response", "curves", "of", "(", "i", ")", "MB002", ",", "(", "ii", ")", "Daoy", ",", "and", "(", "iii", ")", "MIC", "in", "the", "absence", "(", "EV", ")", "or", "presence", "of", "SOX9", "by", "Alamar", "blue", ".", "Cells", "were", "pre", "-", "conditioned", "with", "doxycycline", "to", "induce", "expression", "of", "SOX9", "(", "or", "EV", ")", "prior", "to", "treatment", "with", "increasing", "concentrations", "of", "cisplatin", ".", "The", "IC50", "were", "calculated", "following", "5", "(", "MB002", "and", "MIC", ")", "or", "3", "days", "(", "Daoy", ")", "of", "treatment", ".", "Data", "are", "mean", "+", "standard", "deviation", "from", "3", "independent", "repeats", ",", "each", "containing", "5", "technical", "replicates", ".", "(", "B", ")", "Cisplatin", "dose", "response", "curves", "of", "SOX9", "-", "expressing", "(", "i", ")", "Daoy", "and", "(", "ii", ")", "MIC", "in", "the", "absence", "or", "presence", "of", "FBW7\u03b1", ".", "Experiments", "and", "data", "analysis", "were", "performed", "as", "described", "in", "(", "A", ")", "(", "C", ")", "Overall", "survival", "analysis", "of", "mice", "bearing", "Daoy", "or", "Daoy", "-", "expressing", "dox", "-", "inducible", "SOX9", "treated", "with", "cisplatin", ".", "The", "dox", "-", "preconditioned", "cells", "(", "105", "cells", ")", "were", "orthotopically", "xenografted", "to", "Nude", "-", "Foxn1nu", "mice", "and", "left", "for", "1", "week", "to", "prior", "to", "being", "treated", "with", "vehicle", "control", "or", "cisplatin", "(", "2mg", "/", "kg", ")", "intraperitoneally", "for", "every", "other", "day", "for", "a", "total", "of", "6", "doses", ".", "(", "D", ")", "Heat", "map", "of", "the", "row", "-", "wise", "z", "-", "scores", "of", "11", "genes", "associated", "with", "cisplatin", "resistance", "in", "MB002", "expressing", "Sox9", "-", "WT", "or", "Sox9", "-", "T236", "/", "T240A", ".", "Heat", "map", "was", "generated", "using", "the", "GenePattern", "software", ".", "(", "E", ")", "Quantitative", "analysis", "of", "ATP7A", ",", "DUSP2", ",", "and", "TTK", "mRNAs", "in", "MB002", "following", "expression", "of", "SOX9", "-", "WT", "or", "SOX9", "-", "T236", "/", "240A", ".", "Total", "RNA", "were", "collected", "24", "hours", "following", "doxycycline", "treatment", ",", "from", "which", "cDNA", "were", "generated", "for", "qPCR", ".", "Data", "are", "mean", "mRNA", "level", "(", "normalized", "to", "B2M", "transcript", ")", "+", "standard", "deviation", "from", "3", "independent", "experiments", "with", "statistical", "significance", "were", "determined", "by", "Multiple", "comparisons", "2", "-", "way", "ANOVA", "with", "Bonferroni", "'", "s", "post", "-", "test", ".", "(", "F", ")", "Time", "course", "western", "blotting", "of", "HA", "-", "SOX9", ",", "ATP7A", ",", "DUSP2", ",", "ERK1", "/", "2", "pThr202", "/", "Tyr204", "and", "total", "ERK1", "/", "2", "in", "MB002", "cells", "following", "doxycycline", "induction", "of", "either", "EV", ",", "SOX9", "-", "WT", "or", "SOX9", "-", "T236", "/", "240A", ".", "GAPDH", "was", "used", "as", "a", "loading", "control", "."
],
"panel_id": "12345",
"label_ids": {
"entity_types": [
"O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "O", "O", "O", "B-CELL", "O", "O", "O", "O", "B-CELL", "O", "O", "O", "O", "O", "B-CELL", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-GENEPROD", "O", "B-EXP_ASSAY", "I-EXP_ASSAY", "O", "O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "O", "O", "O", "O", "B-CELL", "O", "B-CELL", "O", "O", "O", "O", "O", "B-CELL", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "B-CELL", "O", "O", "O", "O", "B-CELL", "O", "O", "O", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-EXP_ASSAY", "O", "O", "B-ORGANISM", "O", "B-CELL", "O", "B-CELL", "O", "O", "B-SMALL_MOLECULE", "O", "O", "B-GENEPROD", "O", "O", "B-SMALL_MOLECULE", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-ORGANISM", "O", "O", "O", "B-GENEPROD", "B-ORGANISM", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "B-CELL", "O", "B-GENEPROD", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-GENEPROD", "O", "B-GENEPROD", "O", "O", "B-GENEPROD", "O", "O", "B-CELL", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "O", "O", "O", "O", "B-EXP_ASSAY", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-EXP_ASSAY", "I-EXP_ASSAY", "O", "B-GENEPROD", "O", "B-GENEPROD", "O", "B-GENEPROD", "O", "B-GENEPROD", "O", "B-GENEPROD", "I-GENEPROD", "I-GENEPROD", "O", "O", "O", "O", "O", "B-GENEPROD", "I-GENEPROD", "I-GENEPROD", "O", "B-CELL", "O", "O", "B-SMALL_MOLECULE", "O", "O", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "B-GENEPROD", "O", "O", "O", "O", "O", "O", "O"
],
"geneprod_roles": [
"O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-MEASURED_VAR", "O", "B-MEASURED_VAR", "O", "O", "B-MEASURED_VAR", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-MEASURED_VAR", "O", "B-MEASURED_VAR", "O", "B-MEASURED_VAR", "O", "B-MEASURED_VAR", "I-MEASURED_VAR", "I-MEASURED_VAR", "O", "O", "O", "O", "O", "B-MEASURED_VAR", "I-MEASURED_VAR", "I-MEASURED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"
],
"boring": [
"O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "B-BORING", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-BORING", "O", "O", "O", "O", "O", "O", "O"
],
"panel_start": [
"O", "O", "O", "B-PANEL_START", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-PANEL_START", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-PANEL_START", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-PANEL_START", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-PANEL_START", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-PANEL_START", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"
],
"small_mol_roles": ["O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "B-CONTROLLED_VAR", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"]
}
}
```
### Data Fields
- `words`: `list` of `strings`; the text tokenized into words.
- `panel_id`: ID of the panel to which the example belongs in the SourceData database.
- `label_ids`:
- `entity_types`: `list` of `strings` for the IOB2 tags for entity type; possible value in `["O", "I-SMALL_MOLECULE", "B-SMALL_MOLECULE", "I-GENEPROD", "B-GENEPROD", "I-SUBCELLULAR", "B-SUBCELLULAR", "I-CELL", "B-CELL", "I-TISSUE", "B-TISSUE", "I-ORGANISM", "B-ORGANISM", "I-EXP_ASSAY", "B-EXP_ASSAY"]`
- `geneprod_roles`: `list` of `strings` for the IOB2 tags for experimental roles; values in `["O", "I-CONTROLLED_VAR", "B-CONTROLLED_VAR", "I-MEASURED_VAR", "B-MEASURED_VAR"]`
- `boring`: `list` of `strings` for IOB2 tags for entities unrelated to causal design; values in `["O", "I-BORING", "B-BORING"]`
- `panel_start`: `list` of `strings` for IOB2 tags `["O", "B-PANEL_START"]`
- `small_mol_roles`: `list` of `strings` for IOB2 tags showing whether the entity is the variable being measured or the controlled variable; values in `["O", "B-CONTROLLED_VAR", "I-CONTROLLED_VAR", "B-MEASURED_VAR", "I-MEASURED_VAR"]`
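A minimal loading sketch (field names follow the Data Instances example above; whether the loader requires a configuration name is an assumption to verify):
```python
from datasets import load_dataset

# A minimal sketch; adjust the configuration name if the loader requires one.
ds = load_dataset("EMBO/sd-nlp-non-tokenized")
example = ds["train"][0]
print(example["panel_id"])
print(example["words"][:10])
```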
### Data Splits
- train:
- features: ['words', 'labels', 'tag_mask', 'panel_id'],
- num_rows: 50_198
- validation:
- features: ['words', 'labels', 'tag_mask', 'panel_id'],
- num_rows: 5_946
- test:
- features: ['words', 'labels', 'tag_mask', 'panel_id'],
- num_rows: 6_222
## Dataset Creation
### Curation Rationale
The dataset was built to train models for the automatic extraction of a knowledge graph from the scientific literature. The dataset can be used to train models for text segmentation, named entity recognition and semantic role labeling.
### Source Data
#### Initial Data Collection and Normalization
Figure legends were annotated according to the SourceData framework described in Liechti et al 2017 (Nature Methods, 2017, https://doi.org/10.1038/nmeth.4471). The curation tool at https://curation.sourcedata.io was used to segment figure legends into panel legends, tag entities, assign experimental roles and normalize with standard identifiers (not available in this dataset). The source data was downloaded from the SourceData API (https://api.sourcedata.io) on 21 Jan 2021.
#### Who are the source language producers?
The examples are extracted from the figure legends from scientific papers in cell and molecular biology.
### Annotations
#### Annotation process
The annotations were produced manually by expert curators from the SourceData project (https://sourcedata.embo.org).
#### Who are the annotators?
Curators of the SourceData project.
### Personal and Sensitive Information
None known.
## Considerations for Using the Data
### Social Impact of Dataset
Not applicable.
### Discussion of Biases
The examples are heavily biased towards cell and molecular biology and are enriched in examples from papers published in EMBO Press journals (https://embopress.org).
Disease annotations were added to the dataset only recently; although they appear, their number is very low and they are not consistently tagged throughout the entire dataset.
We therefore recommend using the disease annotations only after filtering for the examples that contain them.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Thomas Lemberger, EMBO.
Jorge Abreu Vicente, EMBO
### Licensing Information
CC BY 4.0
### Citation Information
We are currently working on a paper to present the dataset. It is expected to be ready by spring 2023. In the meantime, the following paper should be cited.
```latex
@article {Liechti2017,
author = {Liechti, Robin and George, Nancy and Götz, Lou and El-Gebali, Sara and Chasapi, Anastasia and Crespo, Isaac and Xenarios, Ioannis and Lemberger, Thomas},
title = {SourceData - a semantic platform for curating and searching figures},
year = {2017},
volume = {14},
number = {11},
doi = {10.1038/nmeth.4471},
URL = {https://doi.org/10.1038/nmeth.4471},
eprint = {https://www.biorxiv.org/content/early/2016/06/20/058529.full.pdf},
journal = {Nature Methods}
}
```
### Contributions
Thanks to [@tlemberger](https://github.com/tlemberger) and [@drAbreu](https://github.com/drAbreu) for adding this dataset.
|
tyzhu/fw_baseline_train_10000_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval_find_word
path: data/eval_find_word-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1724070
num_examples: 10000
- name: eval_find_word
num_bytes: 17146
num_examples: 100
- name: validation
num_bytes: 17146
num_examples: 100
download_size: 849667
dataset_size: 1758362
---
# Dataset Card for "fw_baseline_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sajjadamjad/quiz_llm_tinyllama | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 344232.0
num_examples: 42
- name: test
num_bytes: 40980.0
num_examples: 5
download_size: 156805
dataset_size: 385212.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
RahmaSadder/test4 | ---
license: apache-2.0
---
|
Falah/chapter7_0_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2761
num_examples: 9
download_size: 3857
dataset_size: 2761
---
# Dataset Card for "chapter7_0_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
n-iv/sq | ---
license: openrail
task_categories:
- text-generation
language:
- sq
pretty_name: SQ
size_categories:
- 10M<n<100M
---
### Albanian dataset corpus
It consists of 36M phrases/articles collected from the internet.
To cite:
```
@misc{https://doi.org/10.57967/hf/0324,
doi = {10.57967/HF/0324},
url = {https://huggingface.co/datasets/n-iv/sq},
author = {{Nullius in verba}},
title = {sq},
publisher = {Hugging Face},
year = {2023}
}
``` |
Sleoruiz/disc_cla_plenaria-2 | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: comision
dtype: string
- name: fecha_gaceta
dtype: string
- name: gaceta_numero
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: vectors
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 162072571
num_examples: 42666
download_size: 65858974
dataset_size: 162072571
---
# Dataset Card for "disc_cla_plenaria-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JosueElias/pipeline_dataset2 | ---
dataset_info:
features:
- name: title
dtype: string
- name: section
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1522896529
num_examples: 2101279
download_size: 850821844
dataset_size: 1522896529
---
# Dataset Card for "pipeline_dataset2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathancsci/liberal-and-conservative-news | ---
license: cc0-1.0
---
# liberal-and-conservative-news
This dataset contains news articles from both liberal and conservative US news outlets. Most articles in this dataset were published between approximately March 2023 and March 2024, although some articles date back further.
If you want to get started immediately with training a text generation model, you can use the provided txt files, which have been cleaned and preprocessed for this task. If you want to do your own processing, this dataset provides csv files with the raw data. The csv files contain the url, headline and body for each article. Note that news outlets may take older articles off their websites. Therefore, depending on how long it has been since the article publication dates, some urls might not work.
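For example, a minimal sketch for working with the raw csv data (assuming the files from the Files section below have been downloaded locally):
```python
import pandas as pd

# Each csv has 'url', 'headline' and 'body' columns, as described above.
liberal = pd.read_csv("liberal_news_articles.csv")
print(liberal.columns.tolist())
print(liberal.loc[0, "headline"])
```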
The liberal.txt and conservative.txt files in this dataset were used to train [liberal-gpt2](https://huggingface.co/jonathancsci/liberal-gpt2) and [conservative-gpt2](https://huggingface.co/jonathancsci/conservative-gpt2) respectively, which are available on Hugging Face.
## Files
- liberal_news_articles.csv: 16,217 total articles from CNN, MSNBC and The New York Times. The schema includes the following columns: 'url', 'headline' and 'body'.
- liberal.txt: a text file that contains all the 'headline' and 'body' fields of liberal_news_articles.csv concatenated together. liberal.txt contains 13,840,860 total words.
- conservative_news_articles.csv: 26,063 total articles from FOX, The American Conservative and The Washington Times. The schema includes the following columns: 'url', 'headline' and 'body'.
- conservative.txt: a text file that contains all the 'headline' and 'body' fields of conservative_news_articles.csv concatenated together. conservative.txt contains 17,358,558 total words.
## Data Cleaning
When creating the txt files, repetitive strings that did not contribute to the content of the articles were removed. The following lists the strings removed for each file; a sketch of this cleaning step follows the list.
- liberal.txt: `'CNN --\n', '\nCNN --\n', ' | CNN', ' | CNN Politics', ' | CNN Business'`
- conservative.txt: `'CLICK HERE TO GET THE FOX NEWS APP', 'CLICK TO GET THE FOX NEWS APP', 'CLICK HERE TO DOWNLOAD THE FOX NEWS APP', 'CLICK HERE TO GET THE FOX NEWS APP]', 'CLICK TO GET THE FOX NEWS APP]', 'GET THE FOX NEWS APP HERE', 'CLICK HERE FOR THE FOX NEWS APP', 'CLICK HERE FOR THE FOX NEWS APP]', 'CLICK HERE TO GET FOX NEWS APP', 'DOWNLOAD THE FOX NEWS APP HERE', 'DOWNLOAD THE FOX NEWS APP TODAY!', 'DOWNLOAD THE FOX NEWS APP HERE', 'CLICK HERE TO GER THE FOX NEWS APP', ': CLICK HERE TO GET THE FOX NEWS APP', 'CLICK TO GET THE FOX NEWS APPA', 'LICK HERE TO GET THE FOX NEWS APP', 'CLICK HERE TO GET THE FOX NEWS APPS', 'CLLICK HERE TO GET THE FOX NEWS APP', 'CLICK TO GET THE FOX NEWS APPCLICK TO GET THE FOX NEWS APP', 'CLICK HERE TO DOWNLOAD FOX NEWS APP', "CLICK HERE TO GET THE FOX NEWS APP'", 'CLICK HE RE TO GET THE FOX NEWS APP'`
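The original cleaning pipeline is not published, so the following simple substring removal is illustrative only:
```python
# The phrase lists above, applied with plain substring removal.
LIBERAL_BOILERPLATE = ["CNN --\n", "\nCNN --\n", " | CNN", " | CNN Politics", " | CNN Business"]

def remove_boilerplate(text: str, phrases: list[str]) -> str:
    """Strip each listed phrase; longest first, so ' | CNN Politics'
    is not clipped by the shorter ' | CNN' pattern."""
    for phrase in sorted(phrases, key=len, reverse=True):
        text = text.replace(phrase, "")
    return text

print(remove_boilerplate("Some story | CNN Politics", LIBERAL_BOILERPLATE))
# Some story
```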
## License
This dataset is placed in the public domain under the [`CC0-1.0`](https://creativecommons.org/publicdomain/zero/1.0/legalcode.en) license. |
PericlesSavio/resumo | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
- text2text-generation
- text-generation
task_ids: []
pretty_name: DIALOGSum Corpus
tags:
- dialogue-summary
- one-liner-summary
- meeting-title
- email-subject
---
# Dataset Card for DIALOGSum Corpus
## Dataset Description
### Links
- **Homepage:** https://aclanthology.org/2021.findings-acl.449
- **Repository:** https://github.com/cylnlp/dialogsum
- **Paper:** https://aclanthology.org/2021.findings-acl.449
- **Point of Contact:** https://huggingface.co/knkarthick
### Dataset Summary
DialogSum is a large-scale dialogue summarization dataset, consisting of 13,460 dialogues (plus 100 holdout examples for topic generation) with corresponding manually labeled summaries and topics.
### Languages
English
## Dataset Structure
### Data Instances
DialogSum is a large-scale dialogue summarization dataset, consisting of 13,460 dialogues (+1000 tests) split into train, test and validation.
The first instance in the training set:
{'id': 'train_0', 'summary': "Mr. Smith's getting a check-up, and Doctor Hawkins advises him to have one every year. Hawkins'll give some information about their classes and medications to help Mr. Smith quit smoking.", 'dialogue': "#Person1#: Hi, Mr. Smith. I'm Doctor Hawkins. Why are you here today?\n#Person2#: I found it would be a good idea to get a check-up.\n#Person1#: Yes, well, you haven't had one for 5 years. You should have one every year.\n#Person2#: I know. I figure as long as there is nothing wrong, why go see the doctor?\n#Person1#: Well, the best way to avoid serious illnesses is to find out about them early. So try to come at least once a year for your own good.\n#Person2#: Ok.\n#Person1#: Let me see here. Your eyes and ears look fine. Take a deep breath, please. Do you smoke, Mr. Smith?\n#Person2#: Yes.\n#Person1#: Smoking is the leading cause of lung cancer and heart disease, you know. You really should quit.\n#Person2#: I've tried hundreds of times, but I just can't seem to kick the habit.\n#Person1#: Well, we have classes and some medications that might help. I'll give you more information before you leave.\n#Person2#: Ok, thanks doctor.", 'topic': "get a check-up"}
### Data Fields
- dialogue: text of dialogue.
- summary: human written summary of the dialogue.
- topic: human written topic/one liner of the dialogue.
- id: unique file id of an example.
### Data Splits
- train: 12460
- val: 500
- test: 1500
- holdout: 100 [Only 3 features: id, dialogue, topic]
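A minimal loading sketch for this copy of the corpus (the repository id comes from this card; the split names are an assumption and may be `validation` rather than `val`):
```python
from datasets import load_dataset

# Field names follow the Data Fields section above.
ds = load_dataset("PericlesSavio/resumo")
sample = ds["train"][0]
print(sample["id"], sample["topic"])
print(sample["summary"])
```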
## Dataset Creation
### Curation Rationale
In paper:
We collect dialogue data for DialogSum from three public dialogue corpora, namely Dailydialog (Li et al., 2017), DREAM (Sun et al., 2019) and MuTual (Cui et al., 2019), as well as an English speaking practice website. These datasets contain face-to-face spoken dialogues that cover a wide range of daily-life topics, including schooling, work, medication, shopping, leisure, travel. Most conversations take place between friends, colleagues, and between service providers and customers.
Compared with previous datasets, dialogues from DialogSum have distinct characteristics:
- Under rich real-life scenarios, including more diverse task-oriented scenarios;
- Have clear communication patterns and intents, which is valuable to serve as summarization sources;
- Have a reasonable length, which comforts the purpose of automatic summarization.
We ask annotators to summarize each dialogue based on the following criteria:
- Convey the most salient information;
- Be brief;
- Preserve important named entities within the conversation;
- Be written from an observer perspective;
- Be written in formal language.
### Who are the source language producers?
linguists
### Who are the annotators?
language experts
## Licensing Information
CC BY-NC-SA 4.0
## Citation Information
```
@inproceedings{chen-etal-2021-dialogsum,
title = "{D}ialog{S}um: {A} Real-Life Scenario Dialogue Summarization Dataset",
author = "Chen, Yulong and
Liu, Yang and
Chen, Liang and
Zhang, Yue",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.449",
doi = "10.18653/v1/2021.findings-acl.449",
pages = "5062--5074",
```
## Contributions
Thanks to [@cylnlp](https://github.com/cylnlp) for adding this dataset. |
autoevaluate/autoeval-staging-eval-squad_v2-squad_v2-c76793-16626243 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: 21iridescent/distilbert-base-uncased-finetuned-squad
metrics: []
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: 21iridescent/distilbert-base-uncased-finetuned-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
AlekseyKorshuk/updated-responses-preview | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: input_text
dtype: string
- name: response
dtype: string
- name: old_output_text
dtype: string
- name: user_id
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 43326669
num_examples: 17407
download_size: 20805678
dataset_size: 43326669
---
# Dataset Card for "updated-responses-preview"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NeuroDragon/BuggedPythonLeetCode | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- code
size_categories:
- 10K<n<100K
---
# Dataset Description
Edit: fixed some bugs with `datasets` not handling all pyarrow types.
## Dataset Summary
This dataset consists of Python coding problems from LeetCode, which have been bugged using the [OpenBugger](https://github.com/furlat/OpenBugger) package. This dataset provides a unique opportunity to study the debugging process in a controlled and replicable environment.
For each correct code snippet, 15 bugged versions were attempted. For each successfully bugged version, a corresponding question mimicking a beginner coder's perspective was generated, creating a Q/A pair. In addition to the code and question, each data entry contains the task description, the bug's location, and debugging instructions.
Finally, the code snippets are wrapped in Python markdown code fences and the conversation is structured using the [ChatML](https://github.com/openai/openai-python/blob/main/chatml.md) format (sketched below).
The highest-quality data for training LLMs is found in https://huggingface.co/datasets/NeuroDragon/BuggedPythonLeetCode/blob/main/train/bugged_leetcode_no_replaced.parquet
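For reference, ChatML delimits each turn with `<|im_start|>` and `<|im_end|>` sentinels. A schematic turn pair might look like the block below; the role names and exact layout used in this dataset are assumptions, so check the files for the concrete structure:
```
<|im_start|>user
<the beginner-style question, including the bugged code>
<|im_end|>
<|im_start|>assistant
<the answer, including the corrected code>
<|im_end|>
```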
## Supported Tasks
This dataset supports a variety of tasks:
- Code Debugging: Predict the correct code snippet given the bugged code and the question.
- Question Answering: Given the task description and bugged code, generate the question.
- Code Generation: Given the task description, question, and debugging instructions, generate the correct code.
- CST Generation: Given the code, generate the concrete syntax tree.
## Languages
The text in the dataset is in English, and the code is in Python.
# Dataset Structure
## Data Instances
The core of the dataset is constructed around the following concepts, refer to the dataframes columns headers for the specific names:
- correct_code: The original, correct Python code snippet.
- task_description: A brief description of the coding task.
- bugged_code: The bugged version of the correct code.
- bugs_location: The location(s) of the bug(s) in the code.
- debugging_instructions: Instructions to help debug the code.
- question: A question related to the bugged code, mimicking a beginner's query.
- answer: The answer to the question, which always contains the original correct code or a chunk of GPT-generated code that matches the original up to linting and comments.
## Data Splits
The dataset is split into five files (a loading sketch follows the list):
- bugged_leetcode_all_conversations_with_embeddings.parquet
- bugged_leetcode_all_conversations.parquet
- bugged_leetcode_no_replaced_with_embeddings.parquet
- bugged_leetcode_no_replaced.parquet
- bugged_leetcode_all_steps.parquet
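A minimal sketch for inspecting the recommended file (the path is the one given above; the column names are assumptions taken from the concept names, so check the actual dataframe headers as the Data Instances section advises):
```python
import pandas as pd

# Column names below follow the concept names from this card and are
# assumptions; verify them against df.columns for the exact headers.
df = pd.read_parquet("train/bugged_leetcode_no_replaced.parquet")
print(df.columns.tolist())
row = df.iloc[0]
print(row["bugged_code"])
print(row["question"])
```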
## Data Generation Process
The data was generated using a combination of LeetCode problem data, the OpenBugger package, and the GPT 3.5 model. The original code was bugged using OpenBugger, and the GPT model was then used to generate a question and answer based on the bugged code and task description, in order to limit GPT's contribution to the natural language rather than the coding aspect of the dataset. Additional processing ensured that the final answer was compilable Python code corresponding to the original LeetCode solution.
# Dataset Creation
## Curation Rationale
This dataset was curated to provide a large-scale, diverse set of Python programming problems and their bugged versions, which could be used for developing and evaluating models for debugging, code generation, and question answering.
## Dataset Source
The original coding problems were sourced from [leetcode-solutions-python](https://huggingface.co/datasets/mhhmm/leetcode-solutions-python).
## Licensing Information
Please refer to the licensing information of the original dataset.
# Dataset Usage
## Usage Caveats
Users should be aware that the questions in this dataset contain some stereotypical phrases, and may benefit from checking the n-gram distributions and filtering out the spikes. Multiple post-processing steps have already been applied, but better safe than sorry.
# Dataset Maintenance
## Contact Information
Please contact the [original author](https://github.com/furlat) for any questions or concerns related to this dataset.
## Dataset Updates
This is a static dataset that does not receive updates. |
Eternity-ai/htlm-0-1 | ---
license: cc-by-nc-4.0
---
|
open-llm-leaderboard/details_Weyaxi__OpenOrca-Nebula-7B | ---
pretty_name: Evaluation run of Weyaxi/OpenOrca-Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/OpenOrca-Nebula-7B](https://huggingface.co/Weyaxi/OpenOrca-Nebula-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__OpenOrca-Nebula-7B_public\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-11-08T11:58:02.317093](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__OpenOrca-Nebula-7B_public/blob/main/results_2023-11-08T11-58-02.317093.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5781344309327976,\n\
\ \"acc_stderr\": 0.03435050067075012,\n \"acc_norm\": 0.581933273042423,\n\
\ \"acc_norm_stderr\": 0.03433158518593753,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.531795789007015,\n\
\ \"mc2_stderr\": 0.015539765760842488\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.552901023890785,\n \"acc_stderr\": 0.014529380160526848,\n\
\ \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6283608842859988,\n\
\ \"acc_stderr\": 0.004822550638450896,\n \"acc_norm\": 0.8183628759211312,\n\
\ \"acc_norm_stderr\": 0.0038475722596364257\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336284,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336284\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.0352439084451178,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.0352439084451178\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042345,\n\
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042345\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7798165137614679,\n \"acc_stderr\": 0.017765978652327562,\n \"\
acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.017765978652327562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335833,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3553072625698324,\n\
\ \"acc_stderr\": 0.01600698993480319,\n \"acc_norm\": 0.3553072625698324,\n\
\ \"acc_norm_stderr\": 0.01600698993480319\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.02715520810320086,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.02715520810320086\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n\
\ \"acc_stderr\": 0.012645361435115233,\n \"acc_norm\": 0.4302477183833116,\n\
\ \"acc_norm_stderr\": 0.012645361435115233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.03034326422421352,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.03034326422421352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5980392156862745,\n \"acc_stderr\": 0.01983517648437539,\n \
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.01983517648437539\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n\
\ \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.531795789007015,\n\
\ \"mc2_stderr\": 0.015539765760842488\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/OpenOrca-Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|arc:challenge|25_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hellaswag|10_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T11-58-02.317093.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-08T11-58-02.317093.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-08T11-58-02.317093.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-08T11-58-02.317093.parquet'
- config_name: results
data_files:
- split: 2023_11_08T11_58_02.317093
path:
- results_2023-11-08T11-58-02.317093.parquet
- split: latest
path:
- results_2023-11-08T11-58-02.317093.parquet
---
# Dataset Card for Evaluation run of Weyaxi/OpenOrca-Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/OpenOrca-Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/OpenOrca-Nebula-7B](https://huggingface.co/Weyaxi/OpenOrca-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__OpenOrca-Nebula-7B_public",
"harness_truthfulqa_mc_0",
split="train")
```
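The aggregated results can be loaded the same way from the `results` configuration (per the configs above, the `latest` split always points to the most recent run):

```python
from datasets import load_dataset
results = load_dataset("open-llm-leaderboard/details_Weyaxi__OpenOrca-Nebula-7B_public",
	"results",
	split="latest")
```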
## Latest results
These are the [latest results from run 2023-11-08T11:58:02.317093](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__OpenOrca-Nebula-7B_public/blob/main/results_2023-11-08T11-58-02.317093.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5781344309327976,
"acc_stderr": 0.03435050067075012,
"acc_norm": 0.581933273042423,
"acc_norm_stderr": 0.03433158518593753,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.531795789007015,
"mc2_stderr": 0.015539765760842488
},
"harness|arc:challenge|25": {
"acc": 0.552901023890785,
"acc_stderr": 0.014529380160526848,
"acc_norm": 0.5870307167235495,
"acc_norm_stderr": 0.014388344935398326
},
"harness|hellaswag|10": {
"acc": 0.6283608842859988,
"acc_stderr": 0.004822550638450896,
"acc_norm": 0.8183628759211312,
"acc_norm_stderr": 0.0038475722596364257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.0352439084451178,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.0352439084451178
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.017765978652327562,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.017765978652327562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3553072625698324,
"acc_stderr": 0.01600698993480319,
"acc_norm": 0.3553072625698324,
"acc_norm_stderr": 0.01600698993480319
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.02715520810320086,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.02715520810320086
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.012645361435115233,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.012645361435115233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.03034326422421352,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.03034326422421352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.01983517648437539,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.01983517648437539
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.531795789007015,
"mc2_stderr": 0.015539765760842488
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
anti-ai/ViNLI-SimCSE-supervised | ---
language:
- vi
license: apache-2.0
task_categories:
- sentence-similarity
size_categories:
- 100K<n<1M
--- |
Alienmaster/wikipedia_leipzig_de_2016 | ---
language:
- de
multilinguality:
- monolingual
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- text-classification
pretty_name: Leipzig Corpora Wikipedia 2016 German
configs:
- config_name: default
data_files:
- split: 10k
path: "10k.parquet"
- split: 30k
path: "30k.parquet"
- split: 100k
path: "100k.parquet"
- split: 1mio
path: "1mio.parquet"
---
## Leipzig Corpora Wikipedia 2016 German
This dataset contains different splits (between 10k and 1mio) of the German Wikipedia. The data were collected in 2016.
Every element in the dataset is labeled as "neutral".
The source can be found [here](https://wortschatz.uni-leipzig.de/de/download/German)
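For example, a single split can be loaded directly with the `datasets` library (a minimal sketch; the exact column names are not documented in this card):
```python
from datasets import load_dataset

# Load one of the size-based splits: "10k", "30k", "100k" or "1mio";
# every element is labeled "neutral".
ds = load_dataset("Alienmaster/wikipedia_leipzig_de_2016", split="10k")
print(ds[0])
```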
## Citation
```
@inproceedings{goldhahn-etal-2012-building,
title = "Building Large Monolingual Dictionaries at the {L}eipzig Corpora Collection: From 100 to 200 Languages",
author = "Goldhahn, Dirk and
Eckart, Thomas and
Quasthoff, Uwe",
editor = "Calzolari, Nicoletta and
Choukri, Khalid and
Declerck, Thierry and
Do{\u{g}}an, Mehmet U{\u{g}}ur and
Maegaard, Bente and
Mariani, Joseph and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Eighth International Conference on Language Resources and Evaluation ({LREC}'12)",
month = may,
year = "2012",
address = "Istanbul, Turkey",
publisher = "European Language Resources Association (ELRA)",
url = "http://www.lrec-conf.org/proceedings/lrec2012/pdf/327_Paper.pdf",
pages = "759--765",
abstract = "The Leipzig Corpora Collection offers free online access to 136 monolingual dictionaries enriched with statistical information. In this paper we describe current advances of the project in collecting and processing text data automatically for a large number of languages. Our main interest lies in languages of low density, where only few text data exists online. The aim of this approach is to create monolingual dictionaries and statistical information for a high number of new languages and to expand the existing dictionaries, opening up new possibilities for linguistic typology and other research. Focus of this paper will be set on the infrastructure for the automatic acquisition of large amounts of monolingual text in many languages from various sources. Preliminary results of the collection of text data will be presented. The mainly language-independent framework for preprocessing, cleaning and creating the corpora and computing the necessary statistics will also be depicted.",
}
``` |
one-sec-cv12/chunk_73 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24688658160.375
num_examples: 257045
download_size: 22518351650
dataset_size: 24688658160.375
---
# Dataset Card for "chunk_73"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rcp-meetings/rudialogsum_v2 | ---
license: mit
task_categories:
- text2text-generation
- summarization
language:
- ru
size_categories:
- 10K<n<100K
---
The dialogsum dataset translated into Russian. Translation glitches have been removed by automatic cleaning.
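For illustration, the dataset can be loaded in the usual way (a minimal sketch; the card does not document split or column names, so only the default loading call is shown):
```python
from datasets import load_dataset

# Load the Russian translation of dialogsum; the card does not
# document split or column names, so we simply inspect the result.
ds = load_dataset("rcp-meetings/rudialogsum_v2")
print(ds)
``` |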
Nexdata/4601_Images_22_Kinds_of_Bills_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
4,601 Images - 22 Kinds of Bills OCR Data. The images have a pure-color background and cover 22 kinds of bills from multiple provinces. The annotations consist of line-level quadrilateral bounding boxes and line-level transcriptions of the texts. The data can be used for tasks such as OCR for bills.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1028?source=Huggingface
# Specifications
## Data size
4,601 images, 22 kinds
## Collection environment
pure color background
## Data diversity
including multiple types of bills from multiple provinces
## Device
cellphone
## Image Parameter
the image data is in .jpg format, the annotation file is in .json format
## Annotation content
line-level quadrilateral bounding box annotation, line-level transcription for the texts
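As a sketch only, a parser for such line-level annotations might look like the following; the field names `points` and `transcription` are hypothetical, since the actual JSON schema is not documented here:
```python
import json

def load_bill_annotations(path):
    """Parse one hypothetical annotation file: a list of text lines,
    each carrying a quadrilateral bounding box and a transcription."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    lines = []
    for rec in records:
        quad = rec["points"]         # hypothetical: four (x, y) vertices
        text = rec["transcription"]  # hypothetical: line-level text
        lines.append((quad, text))
    return lines
```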
## Accuracy
the error bound of each vertex of the quadrilateral bounding box is within 5 pixels, which is considered a qualified annotation
# Licensing Information
Commercial License
|
mcimpoi/dtd_split_1 | ---
license: cc-by-4.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': banded
'1': blotchy
'2': braided
'3': bubbly
'4': bumpy
'5': chequered
'6': cobwebbed
'7': cracked
'8': crosshatched
'9': crystalline
'10': dotted
'11': fibrous
'12': flecked
'13': freckled
'14': frilly
'15': gauzy
'16': grid
'17': grooved
'18': honeycombed
'19': interlaced
'20': knitted
'21': lacelike
'22': lined
'23': marbled
'24': matted
'25': meshed
'26': paisley
'27': perforated
'28': pitted
'29': pleated
'30': polka-dotted
'31': porous
'32': potholed
'33': scaly
'34': smeared
'35': spiralled
'36': sprinkled
'37': stained
'38': stratified
'39': striped
'40': studded
'41': swirly
'42': veined
'43': waffled
'44': woven
'45': wrinkled
'46': zigzagged
splits:
- name: train
num_bytes: 226313270.04
num_examples: 1880
- name: test
num_bytes: 172035822
num_examples: 1880
- name: validation
num_bytes: 222278767.48
num_examples: 1880
download_size: 629315160
dataset_size: 620627859.52
task_categories:
- image-classification
language:
- en
tags:
- texture
- computer-vision
pretty_name: Describable Textures Dataset
size_categories:
- 1K<n<10K
---
# Dataset Card for Describable Textures Dataset (DTD)
## Dataset Description
- Homepage: https://www.robots.ox.ac.uk/~vgg/data/dtd/
- Repository: https://github.com/mcimpoi/deep-fbanks
- Paper: https://openaccess.thecvf.com/content_cvpr_2014/html/Cimpoi_Describing_Textures_in_2014_CVPR_paper.html
- Leaderboard: https://paperswithcode.com/sota/image-classification-on-dtd
### Dataset Summary
A texture classification dataset consisting of 47 categories with 120 images per class.
### Data Splits
Equally split into train, validation, and test. The original paper proposed 10 splits; recent works (e.g. BYOL, arXiv:2006.07733) use only the first split.
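For instance, this split can be loaded with the `datasets` library (a minimal sketch based on the features declared above):
```python
from datasets import load_dataset

# Each example carries an `image` and an integer `label` over the
# 47 texture classes declared in the features above.
ds = load_dataset("mcimpoi/dtd_split_1")
train = ds["train"]
label_names = train.features["label"].names
print(len(train), label_names[train[0]["label"]])
```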
### Licensing Information
Not defined at https://www.robots.ox.ac.uk/~vgg/data/dtd/
### Citation Information
@InProceedings{cimpoi14describing,
Author = {M. Cimpoi and S. Maji and I. Kokkinos and S. Mohamed and A. Vedaldi},
Title = {Describing Textures in the Wild},
Booktitle = {Proceedings of the {IEEE} Conf. on Computer Vision and Pattern Recognition ({CVPR})},
Year = {2014}}
|
metredo085/tania | ---
license: apache-2.0
---
|
nguyenminh871/titan_0_5_1 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: func
dtype: string
- name: target
dtype: bool
- name: project
dtype: string
splits:
- name: titan_0_5_1
num_bytes: 4760562
num_examples: 1770
download_size: 1279691
dataset_size: 4760562
---
# Dataset Card for "titan_0_5_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_12_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 948
num_examples: 32
download_size: 1915
dataset_size: 948
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_12_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
declare-lab/audio-alpaca | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: audio
- name: rejected
dtype: audio
- name: strategy
dtype: string
splits:
- name: train
num_bytes: 9851286989.75
num_examples: 15025
download_size: 9708866178
dataset_size: 9851286989.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
language:
- en
pretty_name: Audio-alpaca
size_categories:
- 10K<n<100K
---
# Audio-alpaca: A preference dataset for aligning text-to-audio models
Audio-alpaca is a pairwise preference dataset containing about 15k (prompt, chosen, rejected) triplets where, given a textual prompt, **chosen** is the preferred generated audio and **rejected** is the undesirable audio.
## Field details
**prompt**: Given textual prompt
**chosen**: The preferred audio sample
**rejected**: The rejected audio sample
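As an illustration, the preference pairs can be loaded with the `datasets` library (a minimal sketch; the audio columns decode to an array and sampling rate via the `Audio` feature):
```python
from datasets import load_dataset

# Each row pairs a text prompt with a preferred ("chosen") and an
# undesirable ("rejected") audio clip, plus the creation strategy.
ds = load_dataset("declare-lab/audio-alpaca", split="train")
row = ds[0]
print(row["prompt"], row["strategy"])
print(row["chosen"]["sampling_rate"], len(row["chosen"]["array"]))
``` |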
open-llm-leaderboard/details_eren23__Experiment26-12B | ---
pretty_name: Evaluation run of eren23/Experiment26-12B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/Experiment26-12B](https://huggingface.co/eren23/Experiment26-12B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__Experiment26-12B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T12:49:09.388382](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__Experiment26-12B/blob/main/results_2024-03-13T12-49-09.388382.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6402681032299704,\n\
\ \"acc_stderr\": 0.03257871602190505,\n \"acc_norm\": 0.6425937133842362,\n\
\ \"acc_norm_stderr\": 0.033246297990057946,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7212247872838202,\n\
\ \"mc2_stderr\": 0.014761691292219955\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760427,\n\
\ \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.013532472099850945\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7141007767377017,\n\
\ \"acc_stderr\": 0.0045091819193228445,\n \"acc_norm\": 0.8858793069109739,\n\
\ \"acc_norm_stderr\": 0.0031730798074401816\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251735,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251735\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657578,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657578\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n\
\ \"acc_stderr\": 0.01646981492840617,\n \"acc_norm\": 0.4134078212290503,\n\
\ \"acc_norm_stderr\": 0.01646981492840617\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48239895697522817,\n\
\ \"acc_stderr\": 0.012762321298823643,\n \"acc_norm\": 0.48239895697522817,\n\
\ \"acc_norm_stderr\": 0.012762321298823643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7212247872838202,\n\
\ \"mc2_stderr\": 0.014761691292219955\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370656\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.49962092494313876,\n \
\ \"acc_stderr\": 0.013772480761626172\n }\n}\n```"
repo_url: https://huggingface.co/eren23/Experiment26-12B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|arc:challenge|25_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|gsm8k|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hellaswag|10_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T12-49-09.388382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T12-49-09.388382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- '**/details_harness|winogrande|5_2024-03-13T12-49-09.388382.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T12-49-09.388382.parquet'
- config_name: results
data_files:
- split: 2024_03_13T12_49_09.388382
path:
- results_2024-03-13T12-49-09.388382.parquet
- split: latest
path:
- results_2024-03-13T12-49-09.388382.parquet
---
# Dataset Card for Evaluation run of eren23/Experiment26-12B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/Experiment26-12B](https://huggingface.co/eren23/Experiment26-12B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__Experiment26-12B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-13T12:49:09.388382](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__Experiment26-12B/blob/main/results_2024-03-13T12-49-09.388382.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6402681032299704,
"acc_stderr": 0.03257871602190505,
"acc_norm": 0.6425937133842362,
"acc_norm_stderr": 0.033246297990057946,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7212247872838202,
"mc2_stderr": 0.014761691292219955
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760427,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.013532472099850945
},
"harness|hellaswag|10": {
"acc": 0.7141007767377017,
"acc_stderr": 0.0045091819193228445,
"acc_norm": 0.8858793069109739,
"acc_norm_stderr": 0.0031730798074401816
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5476190476190477,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.5476190476190477,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251735,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251735
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657578,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657578
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.01646981492840617,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.01646981492840617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48239895697522817,
"acc_stderr": 0.012762321298823643,
"acc_norm": 0.48239895697522817,
"acc_norm_stderr": 0.012762321298823643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7212247872838202,
"mc2_stderr": 0.014761691292219955
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370656
},
"harness|gsm8k|5": {
"acc": 0.49962092494313876,
"acc_stderr": 0.013772480761626172
}
}
```
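If you prefer the raw results file linked above to the dataset splits, here is a minimal sketch using `huggingface_hub` (the filename is the one from the link; the exact nesting of the JSON may differ between harness versions):
```python
import json

from huggingface_hub import hf_hub_download

# download the raw results file from the dataset repository
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_eren23__Experiment26-12B",
    repo_type="dataset",
    filename="results_2024-03-13T12-49-09.388382.json",
)

with open(path) as f:
    results = json.load(f)

# the aggregated scores shown above; they may sit under a top-level "results" key
scores = results.get("results", results)
print(scores["all"])
```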
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DjSteker/yelp_review_full1 | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': 1 star
'1': 2 star
'2': 3 stars
'3': 4 stars
'4': 5 stars
- name: text
dtype: string
splits:
- name: train
num_bytes: 483811554
num_examples: 650000
- name: test
num_bytes: 37271188
num_examples: 50000
download_size: 322952369
dataset_size: 521082742
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
HuggingFaceM4/cm4_valid-Sample | Invalid username or password. |
loubnabnl/humaneval_infilling | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- code
license:
- mit
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: OpenAI HumanEval-Infilling
tags:
- code-generation
---
# HumanEval-Infilling
## Dataset Description
- **Repository:** https://github.com/openai/human-eval-infilling
- **Paper:** https://arxiv.org/pdf/2207.14255
## Dataset Summary
[HumanEval-Infilling](https://github.com/openai/human-eval-infilling) is a benchmark for infilling tasks, derived from the [HumanEval](https://huggingface.co/datasets/openai_humaneval) benchmark for the evaluation of code generation models.
## Dataset Structure
To load the dataset you need to specify a subset. By default `HumanEval-SingleLineInfilling` is loaded.
```python
from datasets import load_dataset
ds = load_dataset("humaneval_infilling", "HumanEval-RandomSpanInfilling")
print(ds)
# DatasetDict({
#     test: Dataset({
#         features: ['task_id', 'entry_point', 'prompt', 'suffix', 'canonical_solution', 'test'],
#         num_rows: 1640
#     })
# })
```
## Subsets
This dataset has 4 subsets: HumanEval-MultiLineInfilling, HumanEval-SingleLineInfilling, HumanEval-RandomSpanInfilling, HumanEval-RandomSpanInfillingLight.
The single-line, multi-line, and random-span infilling subsets, plus the light version of the random-span subset, contain 1033, 5815, 1640, and 164 tasks, respectively.
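As a quick sanity check, here is a sketch that loads each subset with the same call as above and verifies the task counts (counts taken from the description above):
```python
from datasets import load_dataset

subsets = {
    "HumanEval-SingleLineInfilling": 1033,
    "HumanEval-MultiLineInfilling": 5815,
    "HumanEval-RandomSpanInfilling": 1640,
    "HumanEval-RandomSpanInfillingLight": 164,
}
for name, expected in subsets.items():
    ds = load_dataset("humaneval_infilling", name, split="test")
    print(f"{name}: {len(ds)} tasks")
    assert len(ds) == expected
```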
## Citation
```
@article{bavarian2022efficient,
title={Efficient Training of Language Models to Fill in the Middle},
author={Bavarian, Mohammad and Jun, Heewoo and Tezak, Nikolas and Schulman, John and McLeavey, Christine and Tworek, Jerry and Chen, Mark},
journal={arXiv preprint arXiv:2207.14255},
year={2022}
}
``` |
danielwasewicz/qc | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: code_snippet
dtype: string
splits:
- name: train
num_bytes: 131717
num_examples: 32
download_size: 62683
dataset_size: 131717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
OussamaFajrE/GherkinSyntax | ---
license: mit
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_amu__orpo-phi2 | ---
pretty_name: Evaluation run of amu/orpo-phi2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [amu/orpo-phi2](https://huggingface.co/amu/orpo-phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_amu__orpo-phi2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T18:28:59.918481](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__orpo-phi2/blob/main/results_2024-04-02T18-28-59.918481.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28937443946253527,\n\
\ \"acc_stderr\": 0.03214011600761226,\n \"acc_norm\": 0.2915287325390915,\n\
\ \"acc_norm_stderr\": 0.03300223365820327,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237024,\n \"mc2\": 0.4761965022767635,\n\
\ \"mc2_stderr\": 0.01637823785885922\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28668941979522183,\n \"acc_stderr\": 0.013214986329274757,\n\
\ \"acc_norm\": 0.3122866894197952,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.33359888468432586,\n\
\ \"acc_stderr\": 0.004705347137699603,\n \"acc_norm\": 0.4151563433578968,\n\
\ \"acc_norm_stderr\": 0.004917419367766031\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525214,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.29354838709677417,\n \"acc_stderr\": 0.02590608702131929,\n \"\
acc_norm\": 0.29354838709677417,\n \"acc_norm_stderr\": 0.02590608702131929\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.036810508691615514,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.036810508691615514\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026867,\n \"\
acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026867\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.32564102564102565,\n \"acc_stderr\": 0.02375966576741229,\n\
\ \"acc_norm\": 0.32564102564102565,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275784,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275784\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.029719142876342863,\n\
\ \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.029719142876342863\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3302752293577982,\n \"acc_stderr\": 0.02016446633634298,\n \"\
acc_norm\": 0.3302752293577982,\n \"acc_norm_stderr\": 0.02016446633634298\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138598,\n \
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n\
\ \"acc_stderr\": 0.028930413120910888,\n \"acc_norm\": 0.24663677130044842,\n\
\ \"acc_norm_stderr\": 0.028930413120910888\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591203,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591203\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.046202840822800406,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.046202840822800406\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n\
\ \"acc_stderr\": 0.030679022765498835,\n \"acc_norm\": 0.3247863247863248,\n\
\ \"acc_norm_stderr\": 0.030679022765498835\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456344,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456344\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3103448275862069,\n\
\ \"acc_stderr\": 0.016543785026048315,\n \"acc_norm\": 0.3103448275862069,\n\
\ \"acc_norm_stderr\": 0.016543785026048315\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044273,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044273\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261426,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261426\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.026664410886937606,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.026664410886937606\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266722,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266722\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2529335071707953,\n\
\ \"acc_stderr\": 0.011102268713839987,\n \"acc_norm\": 0.2529335071707953,\n\
\ \"acc_norm_stderr\": 0.011102268713839987\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.026799562024887674,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.026799562024887674\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878285,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878285\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.02520696315422538,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.02520696315422538\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.0320384104021332,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.0320384104021332\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237024,\n \"mc2\": 0.4761965022767635,\n\
\ \"mc2_stderr\": 0.01637823785885922\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5588003157063931,\n \"acc_stderr\": 0.013954975072834724\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/amu/orpo-phi2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-59.918481.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-28-59.918481.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- '**/details_harness|winogrande|5_2024-04-02T18-28-59.918481.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T18-28-59.918481.parquet'
- config_name: results
data_files:
- split: 2024_04_02T18_28_59.918481
path:
- results_2024-04-02T18-28-59.918481.parquet
- split: latest
path:
- results_2024-04-02T18-28-59.918481.parquet
---
# Dataset Card for Evaluation run of amu/orpo-phi2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [amu/orpo-phi2](https://huggingface.co/amu/orpo-phi2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_amu__orpo-phi2",
"harness_winogrande_5",
split="train")
```
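To load the aggregated metrics instead, you can target the "results" configuration mentioned above (a minimal sketch; the `results` config name and `latest` split are taken from the YAML header of this card):

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores of the run;
# the "latest" split always resolves to the most recent timestamped run.
results = load_dataset("open-llm-leaderboard/details_amu__orpo-phi2",
                       "results",
                       split="latest")
print(results[0])
```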
## Latest results
These are the [latest results from run 2024-04-02T18:28:59.918481](https://huggingface.co/datasets/open-llm-leaderboard/details_amu__orpo-phi2/blob/main/results_2024-04-02T18-28-59.918481.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results file and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.28937443946253527,
"acc_stderr": 0.03214011600761226,
"acc_norm": 0.2915287325390915,
"acc_norm_stderr": 0.03300223365820327,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237024,
"mc2": 0.4761965022767635,
"mc2_stderr": 0.01637823785885922
},
"harness|arc:challenge|25": {
"acc": 0.28668941979522183,
"acc_stderr": 0.013214986329274757,
"acc_norm": 0.3122866894197952,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.33359888468432586,
"acc_stderr": 0.004705347137699603,
"acc_norm": 0.4151563433578968,
"acc_norm_stderr": 0.004917419367766031
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525214,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.02590608702131929,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.02590608702131929
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036810508691615514,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036810508691615514
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026867,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026867
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.32564102564102565,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.32564102564102565,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275784,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275784
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29831932773109243,
"acc_stderr": 0.029719142876342863,
"acc_norm": 0.29831932773109243,
"acc_norm_stderr": 0.029719142876342863
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3302752293577982,
"acc_stderr": 0.02016446633634298,
"acc_norm": 0.3302752293577982,
"acc_norm_stderr": 0.02016446633634298
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138598,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910888,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910888
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591203,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591203
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.046202840822800406,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.046202840822800406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498835,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498835
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456344,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456344
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.016543785026048315,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.016543785026048315
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044273,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261426,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261426
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.026664410886937606,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.026664410886937606
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266722,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266722
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2529335071707953,
"acc_stderr": 0.011102268713839987,
"acc_norm": 0.2529335071707953,
"acc_norm_stderr": 0.011102268713839987
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.026799562024887674,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.026799562024887674
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.02520696315422538,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.02520696315422538
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.0320384104021332,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.0320384104021332
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237024,
"mc2": 0.4761965022767635,
"mc2_stderr": 0.01637823785885922
},
"harness|winogrande|5": {
"acc": 0.5588003157063931,
"acc_stderr": 0.013954975072834724
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
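If you prefer the raw file, the same numbers are stored as a standalone JSON in the repo (a sketch assuming only the filename from the link above; the file may wrap the snippet shown here in extra run metadata, so inspect its keys first):

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file linked above and inspect its structure.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_amu__orpo-phi2",
    repo_type="dataset",
    filename="results_2024-04-02T18-28-59.918481.json",
)
with open(path) as f:
    results = json.load(f)
print(list(results.keys()))  # locate the "all" block shown above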
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-53000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 667265
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
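The frontmatter above describes a pre-tokenized dataset (nested integer sequences for `input_ids`, `attention_mask`, and `labels`) with a single `train` split of 1000 examples. A minimal loading sketch, assuming only the default config declared above:

```python
from datasets import load_dataset

# Default config, single "train" split of 1000 pre-tokenized examples.
ds = load_dataset(
    "kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-53000",
    split="train",
)
print(ds.features)  # expected columns: input_ids, attention_mask, labels
```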
|
open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b | ---
pretty_name: Evaluation run of kaitchup/Maixtchup-4x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kaitchup/Maixtchup-4x7b](https://huggingface.co/kaitchup/Maixtchup-4x7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T16:47:01.392242](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b/blob/main/results_2024-01-17T16-47-01.392242.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results file and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6144719599933052,\n\
\ \"acc_stderr\": 0.03303924482918558,\n \"acc_norm\": 0.6168692677516201,\n\
\ \"acc_norm_stderr\": 0.03370135211774917,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5612826178367374,\n\
\ \"mc2_stderr\": 0.015986434965174608\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472439,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893454\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6525592511451902,\n\
\ \"acc_stderr\": 0.004751840646730854,\n \"acc_norm\": 0.8382792272455686,\n\
\ \"acc_norm_stderr\": 0.003674419799353668\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334395,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334395\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6064516129032258,\n \"acc_stderr\": 0.027791878753132274,\n \"\
acc_norm\": 0.6064516129032258,\n \"acc_norm_stderr\": 0.027791878753132274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n\
\ \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603742,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603742\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5612826178367374,\n\
\ \"mc2_stderr\": 0.015986434965174608\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5481425322213799,\n \
\ \"acc_stderr\": 0.013708494995677651\n }\n}\n```"
repo_url: https://huggingface.co/kaitchup/Maixtchup-4x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|arc:challenge|25_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|gsm8k|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hellaswag|10_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T16-47-01.392242.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- '**/details_harness|winogrande|5_2024-01-17T16-47-01.392242.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T16-47-01.392242.parquet'
- config_name: results
data_files:
- split: 2024_01_17T16_47_01.392242
path:
- results_2024-01-17T16-47-01.392242.parquet
- split: latest
path:
- results_2024-01-17T16-47-01.392242.parquet
---
# Dataset Card for Evaluation run of kaitchup/Maixtchup-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kaitchup/Maixtchup-4x7b](https://huggingface.co/kaitchup/Maixtchup-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b",
"harness_winogrande_5",
	split="latest")
```
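The aggregated metrics shown on the leaderboard live in the `results` configuration; here is a minimal sketch for loading them (config and split names come from the YAML header above):
```python
from datasets import load_dataset

# Aggregated results only, pinned to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b",
    "results",
    split="latest",
)
```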
## Latest results
These are the [latest results from run 2024-01-17T16:47:01.392242](https://huggingface.co/datasets/open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b/blob/main/results_2024-01-17T16-47-01.392242.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6144719599933052,
"acc_stderr": 0.03303924482918558,
"acc_norm": 0.6168692677516201,
"acc_norm_stderr": 0.03370135211774917,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5612826178367374,
"mc2_stderr": 0.015986434965174608
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472439,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893454
},
"harness|hellaswag|10": {
"acc": 0.6525592511451902,
"acc_stderr": 0.004751840646730854,
"acc_norm": 0.8382792272455686,
"acc_norm_stderr": 0.003674419799353668
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334395,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6064516129032258,
"acc_stderr": 0.027791878753132274,
"acc_norm": 0.6064516129032258,
"acc_norm_stderr": 0.027791878753132274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082393,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082393
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603742,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603742
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123937,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123937
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5612826178367374,
"mc2_stderr": 0.015986434965174608
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.01200207862948574
},
"harness|gsm8k|5": {
"acc": 0.5481425322213799,
"acc_stderr": 0.013708494995677651
}
}
```
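If you only need the raw JSON excerpted above, it can also be fetched directly from the repository (a sketch, assuming the file name from the link above; the top-level `results` key is an assumption about the file layout):
```python
import json

from huggingface_hub import hf_hub_download

# Download the results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kaitchup__Maixtchup-4x7b",
    repo_type="dataset",
    filename="results_2024-01-17T16-47-01.392242.json",
)
with open(path) as f:
    data = json.load(f)

# Fall back to the whole file if the assumed "results" key is absent.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["mc2"])
```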
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sharc_modified | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|sharc
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: null
pretty_name: SharcModified
tags:
- conversational-qa
dataset_info:
- config_name: mod
features:
- name: id
dtype: string
- name: utterance_id
dtype: string
- name: source_url
dtype: string
- name: snippet
dtype: string
- name: question
dtype: string
- name: scenario
dtype: string
- name: history
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: evidence
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 15138034
num_examples: 21890
- name: validation
num_bytes: 1474239
num_examples: 2270
download_size: 21197271
dataset_size: 16612273
- config_name: mod_dev_multi
features:
- name: id
dtype: string
- name: utterance_id
dtype: string
- name: source_url
dtype: string
- name: snippet
dtype: string
- name: question
dtype: string
- name: scenario
dtype: string
- name: history
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: evidence
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: answer
dtype: string
- name: all_answers
sequence: string
splits:
- name: validation
num_bytes: 1553940
num_examples: 2270
download_size: 2006124
dataset_size: 1553940
- config_name: history
features:
- name: id
dtype: string
- name: utterance_id
dtype: string
- name: source_url
dtype: string
- name: snippet
dtype: string
- name: question
dtype: string
- name: scenario
dtype: string
- name: history
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: evidence
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 15083103
num_examples: 21890
- name: validation
num_bytes: 1468604
num_examples: 2270
download_size: 21136658
dataset_size: 16551707
- config_name: history_dev_multi
features:
- name: id
dtype: string
- name: utterance_id
dtype: string
- name: source_url
dtype: string
- name: snippet
dtype: string
- name: question
dtype: string
- name: scenario
dtype: string
- name: history
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: evidence
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: answer
dtype: string
- name: all_answers
sequence: string
splits:
- name: validation
num_bytes: 1548305
num_examples: 2270
download_size: 2000489
dataset_size: 1548305
---
# Dataset Card for SharcModified
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More info needed]
- **Repository:** [github](https://github.com/nikhilweee/neural-conv-qa)
- **Paper:** [Neural Conversational QA: Learning to Reason v.s. Exploiting Patterns](https://arxiv.org/abs/1909.03759)
- **Leaderboard:** [More info needed]
- **Point of Contact:** [More info needed]
### Dataset Summary
ShARC, a conversational QA task, requires a system to answer user questions based on rules expressed in natural language text.
However, the ShARC dataset contains multiple spurious patterns that neural models can exploit.
SharcModified is a new dataset which reduces the patterns identified in the original dataset.
To reduce the sensitivity of neural models, for each occurrence of an instance conforming to any of the patterns,
we automatically construct alternatives: we either replace the instance with an alternative
that does not exhibit the pattern, or retain the original instance.
The modified ShARC has two versions: sharc-mod and history-shuffled.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is in English (en).
## Dataset Structure
### Data Instances
Each instance follows the feature schema declared in the YAML header above (string values elided for brevity; the `mod_dev_multi` and `history_dev_multi` configurations additionally carry an `all_answers` field):
```
{
    "id": "...",
    "utterance_id": "...",
    "source_url": "...",
    "snippet": "...",
    "question": "...",
    "scenario": "...",
    "history": [
        {
            "follow_up_question": "...",
            "follow_up_answer": "..."
        }
    ],
    "evidence": [
        {
            "follow_up_question": "...",
            "follow_up_answer": "..."
        }
    ],
    "answer": "..."
}
```
### Data Fields
- `id`: a unique string identifier for the instance
- `utterance_id`: a string identifier for the utterance the instance corresponds to
- `source_url`: the URL of the source document for the rule text
- `snippet`: the rule text, in natural language, on which the answer must be based
- `question`: the user's question string
- `scenario`: a natural language description of the user's situation
- `history`: a list of previous dialogue turns, each a dictionary with `follow_up_question` and `follow_up_answer` strings
- `evidence`: a list of dictionaries with the same `follow_up_question`/`follow_up_answer` structure, giving supporting evidence
- `answer`: the answer string
- `all_answers`: a sequence of all acceptable answer strings (present only in the `mod_dev_multi` and `history_dev_multi` configurations)
### Data Splits
Each of the `mod` and `history` configurations has a training and a validation split; the `mod_dev_multi` and `history_dev_multi` configurations contain only a validation split (sizes from the YAML header above).

|                   | train | validation |
|-------------------|------:|-----------:|
| mod               | 21890 |       2270 |
| mod_dev_multi     |     - |       2270 |
| history           | 21890 |       2270 |
| history_dev_multi |     - |       2270 |
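The configurations can be loaded directly with `datasets` (a minimal sketch; config names come from the YAML header above):
```python
from datasets import load_dataset

# Main modified variant: train + validation splits.
mod = load_dataset("sharc_modified", "mod")

# Multi-reference dev set for the same variant (validation only).
mod_dev = load_dataset("sharc_modified", "mod_dev_multi", split="validation")

print(mod["train"][0]["question"])
```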
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Unknown.
### Citation Information
```
@misc{verma2020neural,
    title={Neural Conversational QA: Learning to Reason v.s. Exploiting Patterns},
    author={Nikhil Verma and Abhishek Sharma and Dhiraj Madan and Danish Contractor and Harshit Kumar and Sachindra Joshi},
    year={2020},
    eprint={1909.03759},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
Dahoas/cot_gsm8k_three_step | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 646407.3333795206
num_examples: 605
- name: test
num_bytes: 123083.2153146323
num_examples: 113
- name: val
num_bytes: 24057.4609375
num_examples: 23
download_size: 415155
dataset_size: 793548.0096316529
---
# Dataset Card for "cot_gsm8k_three_step"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
he111111/yelp | ---
license: openrail
---
|
umd-zhou-lab/recycled_alpaca_v1 | ---
dataset_info:
features:
- name: data
struct:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 96478203
num_examples: 52002
download_size: 52032506
dataset_size: 96478203
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "recycled_alpaca_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeoZotos/sciq_diff_discrete | ---
dataset_info:
features:
- name: question
dtype: string
- name: distractor3
dtype: string
- name: distractor1
dtype: string
- name: distractor2
dtype: string
- name: correct_answer
dtype: string
- name: support
dtype: string
- name: topic
dtype: string
- name: difficulty
dtype: int64
splits:
- name: train
num_bytes: 6828428
num_examples: 11679
- name: validation
num_bytes: 577950
num_examples: 1000
- name: test
num_bytes: 588664
num_examples: 1000
download_size: 4768667
dataset_size: 7995042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mxronga/wiki-yo | ---
license: mit
language:
- yo
task_categories:
- text-generation
tags:
- pretrain
---
A 2024 dump of the Yoruba-language Wikipedia. |
mtc/xnli_de_sub_sampled_3000_with_all_gpt-3-5_explanations | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: explanation
dtype: string
splits:
- name: train
num_bytes: 1343062
num_examples: 3000
- name: validation
num_bytes: 504564
num_examples: 2490
- name: test
num_bytes: 1016528
num_examples: 5010
download_size: 1260547
dataset_size: 2864154
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Fred666/ocnli3k | ---
license: gpl-3.0
---
|