| datasetId | card |
|---|---|
jkruk/dw_communities_content | ---
dataset_info:
features:
- name: content
dtype: string
- name: subreddit
dtype: string
splits:
- name: train
num_bytes: 86184647.40351267
num_examples: 579625
download_size: 50409061
dataset_size: 86184647.40351267
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dw_communities_content"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eVaggelia/myFirstDataset | ---
dataset_info:
features:
- name: headline
dtype: string
- name: title_length
dtype: int64
splits:
- name: train
num_bytes: 87889.4055
num_examples: 1079
- name: validation
num_bytes: 9774.54
num_examples: 120
download_size: 0
dataset_size: 97663.9455
---
# Dataset Card for "myFirstDataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
luna-code/langchain | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: api
dtype: string
splits:
- name: train
num_bytes: 4224309.0
num_examples: 2499
download_size: 1639343
dataset_size: 4224309.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/seitokaiyakuindomo | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Seitokai Yakuindomo
This is the image base of the bangumi Seitokai Yakuindomo. We detected 32 characters and 7180 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (approximately 1% of images).
Here is a preview of the characters:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 114 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 1717 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 233 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 48 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 129 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 49 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 52 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 347 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 1238 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 230 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 49 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 88 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 217 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 23 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 935 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 38 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 30 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 243 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 13 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 708 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 65 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 10 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 14 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 69 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 43 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 23 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 35 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 23 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 145 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 39 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 193 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
marcus2000/HSE_project_VK_NLP | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: sentiment
dtype: string
splits:
- name: train
num_bytes: 425667.1102204409
num_examples: 848
- name: test
num_bytes: 75294.88977955912
num_examples: 150
download_size: 274658
dataset_size: 500962.0
---
# Dataset Card for "HSE_project_VK_NLP"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JeanKaddour/minipile | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5906108510
num_examples: 1000000
- name: validation
num_bytes: 2779386
num_examples: 500
- name: test
num_bytes: 58558191
num_examples: 10000
download_size: 3177432813
dataset_size: 5967446087
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license: other
multilinguality:
- monolingual
pretty_name: MiniPile
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: minipile
---
# Dataset Card for MiniPile
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
[The MiniPile Challenge for Data-Efficient Language Models](https://arxiv.org/abs/2304.08442)
### Dataset Summary
MiniPile is a 6GB subset of the [deduplicated The Pile corpus](https://huggingface.co/datasets/EleutherAI/the_pile_deduplicated). To curate MiniPile, we perform a simple, three-step data filtering process: we (1) infer embeddings for all documents of the Pile, (2) cluster the embedding space using k-means, and (3) filter out low-quality clusters.
The primary motivation for curating MiniPile is that (i) diverse pre-training datasets (like the Pile) are often too large for academic budgets and (ii) most smaller-scale datasets are fairly homogeneous and thereby unrepresentative of contemporary general-purpose language models. MiniPile aims to fill this gap and thereby facilitate data-efficient research on model architectures, training procedures, optimizers, etc.
More details on the MiniPile curation procedure and some pre-training results can be found in the [MiniPile paper](https://arxiv.org/abs/2304.08442).
For more details on the Pile corpus, we refer the reader to [the Pile datasheet](https://arxiv.org/abs/2201.07311).
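The three filtering steps above can be sketched in miniature as follows. This is an illustrative toy, not the authors' code: the embeddings are assumed given, the cluster count is a placeholder, and cluster compactness stands in for the paper's cluster-quality judgment.

```python
import math
import random

def minipile_style_filter(embeddings, k=3, keep_fraction=0.67, seed=0):
    """Toy sketch of the MiniPile recipe: (1) document embeddings are
    assumed given, (2) k-means clusters the embedding space, and
    (3) low-quality clusters are dropped. Cluster compactness is used
    here as a stand-in quality score."""
    rng = random.Random(seed)

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # (2) plain Lloyd's-algorithm k-means over the embedding space
    centers = rng.sample(list(embeddings), k)
    labels = [0] * len(embeddings)
    for _ in range(20):
        labels = [min(range(k), key=lambda c: dist(e, centers[c]))
                  for e in embeddings]
        for c in range(k):
            members = [e for e, lab in zip(embeddings, labels) if lab == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]

    # (3) score clusters by mean distance to their centroid; keep the tightest
    def compactness(c):
        d = [dist(e, centers[c]) for e, lab in zip(embeddings, labels) if lab == c]
        return sum(d) / len(d) if d else float("inf")

    kept = set(sorted(range(k), key=compactness)[: max(1, int(k * keep_fraction))])
    return [lab in kept for lab in labels]
```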
### Languages
English (`EN`)
## Additional Information
### Dataset Curators
MiniPile is a subset of the Pile, curated by Jean Kaddour. The Pile was created by Leo Gao, Stella Biderman, Sid Black, Laurence Golding, Travis Hoppe, Charles Foster, Jason Phang, Horace He, Anish Thite, Noa Nabeshima, Shawn Presser, Connor Leahy.
### Licensing Information
Since MiniPile is a subset of the Pile, the same MIT License holds.
### Citation Information
```
@article{kaddour2023minipile,
title={The MiniPile Challenge for Data-Efficient Language Models},
author={Kaddour, Jean},
journal={arXiv preprint arXiv:2304.08442},
year={2023}
}
@article{gao2020pile,
title={The {P}ile: An 800{GB} dataset of diverse text for language modeling},
author={Gao, Leo and Biderman, Stella and Black, Sid and Golding, Laurence and Hoppe, Travis and Foster, Charles and Phang, Jason and He, Horace and Thite, Anish and Nabeshima, Noa and others},
journal={arXiv preprint arXiv:2101.00027},
year={2020}
}
```
|
anjunhu/naively_captioned_CUB2002011_test_4shot | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_cupl
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 21888972.0
num_examples: 800
download_size: 21817071
dataset_size: 21888972.0
---
# Dataset Card for "naively_captioned_CUB2002011_test_4shot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/rita_ainsworth_sakurasounopetnakanojo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Rita Ainsworth (Sakurasou no Pet na Kanojo)
This is the dataset of Rita Ainsworth (Sakurasou no Pet na Kanojo), containing 105 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, bangs`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 105 | 128.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_ainsworth_sakurasounopetnakanojo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 105 | 93.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_ainsworth_sakurasounopetnakanojo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 228 | 185.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_ainsworth_sakurasounopetnakanojo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 105 | 128.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_ainsworth_sakurasounopetnakanojo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 228 | 237.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rita_ainsworth_sakurasounopetnakanojo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/rita_ainsworth_sakurasounopetnakanojo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
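For the fixed-size `IMG+TXT` packages listed above (e.g. `dataset-800.zip`), each image is presumably shipped alongside a same-named `.txt` tag file. A minimal pairing sketch, under that layout assumption (the repo does not document the internal structure):

```python
from pathlib import Path

def iter_img_txt_pairs(dataset_dir):
    """Yield (image_path, tag_text) pairs, assuming every image file has a
    sibling .txt file with the same stem (the common IMG+TXT convention)."""
    root = Path(dataset_dir)
    for img in sorted(root.rglob('*')):
        if img.suffix.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
            txt = img.with_suffix('.txt')
            if txt.exists():
                yield img, txt.read_text(encoding='utf-8').strip()
```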
## List of Clusters
Below are the tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, closed_mouth, smile, solo, anime_coloring, looking_at_viewer, upper_body, indoors, portrait, collared_shirt, hair_between_eyes |
| 1 | 7 |  |  |  |  |  | 1girl, anime_coloring, solo, blurry_background, portrait, open_mouth, parted_bangs |
| 2 | 9 |  |  |  |  |  | 1girl, from_side, profile, solo, smile, closed_mouth, indoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | smile | solo | anime_coloring | looking_at_viewer | upper_body | indoors | portrait | collared_shirt | hair_between_eyes | blurry_background | open_mouth | parted_bangs | from_side | profile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------|:-----------------|:--------------------|:-------------|:----------|:-----------|:-----------------|:--------------------|:--------------------|:-------------|:---------------|:------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 7 |  |  |  |  |  | X | | | X | X | | | | X | | | X | X | X | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | | | | X | | | | | | | X | X |
|
RoryCochrane/fakemon-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 156164016.815
num_examples: 2363
download_size: 156073190
dataset_size: 156164016.815
---
# Dataset Card for "fakemon-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
projectbaraat/kan-eng-Mathematical-0.1 | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 450861385
num_examples: 337092
download_size: 155499390
dataset_size: 450861385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rajendrabaskota/ms-marco-text-generation-gptq-20k-40k | ---
dataset_info:
features:
- name: query_id
dtype: int32
- name: answers
sequence: string
- name: passages
struct:
- name: is_selected
sequence: int32
- name: passage_text
sequence: string
- name: url
sequence: string
- name: query
dtype: string
- name: query_type
dtype: string
- name: wellFormedAnswers
sequence: 'null'
- name: ai_answers
dtype: string
- name: query_len
dtype: int64
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 88905516
num_examples: 20000
download_size: 43404194
dataset_size: 88905516
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adamxyang/1.4b-policy_preference_data_gold_labelled_noisy | ---
dataset_info:
features:
- name: answers
sequence: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: preference
dtype: int64
splits:
- name: train
num_bytes: 27875579
num_examples: 49383
- name: validation
num_bytes: 1139961
num_examples: 2000
download_size: 15731882
dataset_size: 29015540
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
CWKSC/common_voice_11_0-hi-whisper-small | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6283293032
num_examples: 6540
- name: test
num_bytes: 2780330000
num_examples: 2894
download_size: 0
dataset_size: 9063623032
---
# Dataset Card for "common_voice_11_0-hi-whisper-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyenthanhdo/patent_v3.1_merged | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: lang
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 120966768.0893321
num_examples: 100488
download_size: 81294619
dataset_size: 120966768.0893321
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "patent_v3.1_merged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Hack90/ref_seq_fungi | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 18718501891
num_examples: 117479
download_size: 8670246250
dataset_size: 18718501891
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zloading/outdated_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 133852.0
num_examples: 10
download_size: 132701
dataset_size: 133852.0
---
# Dataset Card for "outdated_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_title_v5_full_first_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7724566.286747957
num_examples: 4778
- name: validation
num_bytes: 353148
num_examples: 300
download_size: 1331051
dataset_size: 8077714.286747957
---
# Dataset Card for "squad_qa_title_v5_full_first_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
euclaise/reddit-instruct | ---
dataset_info:
features:
- name: post_title
dtype: string
- name: post_text
dtype: string
- name: post_scores
dtype: int64
- name: comment_text
dtype: string
- name: comment_score
dtype: int64
splits:
- name: train
num_bytes: 126565640.88161694
num_examples: 84784
- name: test
num_bytes: 2985602.021174206
num_examples: 2000
download_size: 67560005
dataset_size: 129551242.90279114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
---
Filtered data from the following subreddits:
"AskAcademia",
"AskComputerScience",
"AskEconomics",
"AskProgramming",
"AskScienceFiction",
"AskSocialScience",
"AskStatistics",
"AskTechnology",
"askmath",
"askphilosophy",
"askpsychology",
"askscience",
"changemyview",
"explainlikeimfive" |
open-llm-leaderboard/details_llm-agents__tora-7b-v1.0 | ---
pretty_name: Evaluation run of llm-agents/tora-7b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-7b-v1.0](https://huggingface.co/llm-agents/tora-7b-v1.0) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-7b-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T12:52:31.057587](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0/blob/main/results_2023-10-27T12-52-31.057587.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03166946308724832,\n\
\ \"em_stderr\": 0.001793377907859907,\n \"f1\": 0.0924370805369127,\n\
\ \"f1_stderr\": 0.002203336567209257,\n \"acc\": 0.3803074247848667,\n\
\ \"acc_stderr\": 0.008348384971774042\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03166946308724832,\n \"em_stderr\": 0.001793377907859907,\n\
\ \"f1\": 0.0924370805369127,\n \"f1_stderr\": 0.002203336567209257\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \
\ \"acc_stderr\": 0.0043020450465642845\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-7b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T12_52_31.057587
path:
- '**/details_harness|drop|3_2023-10-27T12-52-31.057587.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T12-52-31.057587.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T12_52_31.057587
path:
- '**/details_harness|gsm8k|5_2023-10-27T12-52-31.057587.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T12-52-31.057587.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-34-11.685092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T14-34-11.685092.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T12_52_31.057587
path:
- '**/details_harness|winogrande|5_2023-10-27T12-52-31.057587.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T12-52-31.057587.parquet'
- config_name: results
data_files:
- split: 2023_10_10T14_34_11.685092
path:
- results_2023-10-10T14-34-11.685092.parquet
- split: 2023_10_27T12_52_31.057587
path:
- results_2023-10-27T12-52-31.057587.parquet
- split: latest
path:
- results_2023-10-27T12-52-31.057587.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-7b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-7b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-7b-v1.0](https://huggingface.co/llm-agents/tora-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-7b-v1.0",
"harness_winogrande_5",
split="train")
```
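The configuration names above follow a mechanical pattern: the harness task name with `-` and `:` replaced by `_`, plus the few-shot count. A minimal sketch of that mapping (the helper name `harness_config_name` is illustrative, not part of any library):

```python
def harness_config_name(task: str, n_shot: int) -> str:
    # e.g. "hendrycksTest-college_physics" -> "harness_hendrycksTest_college_physics_5"
    #      "truthfulqa:mc"                 -> "harness_truthfulqa_mc_0"
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{n_shot}"

config = harness_config_name("hendrycksTest-college_physics", 5)
```

The resulting string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.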
## Latest results
These are the [latest results from run 2023-10-27T12:52:31.057587](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0/blob/main/results_2023-10-27T12-52-31.057587.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03166946308724832,
"em_stderr": 0.001793377907859907,
"f1": 0.0924370805369127,
"f1_stderr": 0.002203336567209257,
"acc": 0.3803074247848667,
"acc_stderr": 0.008348384971774042
},
"harness|drop|3": {
"em": 0.03166946308724832,
"em_stderr": 0.001793377907859907,
"f1": 0.0924370805369127,
"f1_stderr": 0.002203336567209257
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.0043020450465642845
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
}
}
```
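For programmatic access, the nested structure above can be flattened into per-task rows. A small sketch operating on a subset of the same dictionary (values copied from the results shown here):

```python
# Per-task metrics as reported in this card's latest results.
results = {
    "all": {"em": 0.03166946308724832, "acc": 0.3803074247848667},
    "harness|drop|3": {"em": 0.03166946308724832, "f1": 0.0924370805369127},
    "harness|gsm8k|5": {"acc": 0.025018953752843062},
    "harness|winogrande|5": {"acc": 0.7355958958168903},
}

# Flatten into (task, metric, value) rows, skipping the "all" aggregate.
rows = [
    (task, metric, value)
    for task, metrics in results.items()
    if task != "all"
    for metric, value in metrics.items()
]
```

Each row then carries one metric for one task, which is convenient for tabulating or comparing runs.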
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v2-math-db74ac-2016866707 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v2
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-1.3b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v2
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v2
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-1.3b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v2
* Config: mathemakitten--winobias_antistereotype_test_cot_v2
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
Cognitive-Lab/Aya_Telgu | ---
dataset_info:
- config_name: complete_dataset
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 5553365182
num_examples: 4050096
download_size: 1827014858
dataset_size: 5553365182
- config_name: templated_indic_paraphrase
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 887070
num_examples: 1517
download_size: 304055
dataset_size: 887070
- config_name: templated_indic_sentiment
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 781847
num_examples: 1156
download_size: 318064
dataset_size: 781847
- config_name: templated_telugu_food
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 1108509
num_examples: 441
download_size: 312377
dataset_size: 1108509
- config_name: templated_telugu_jokes
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 966698
num_examples: 929
download_size: 298196
dataset_size: 966698
- config_name: templated_telugu_news
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 1150840295
num_examples: 467090
download_size: 423046750
dataset_size: 1150840295
- config_name: templated_telugu_poems
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 8244805
num_examples: 5115
download_size: 2713407
dataset_size: 8244805
- config_name: templated_telugu_riddles
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 339040
num_examples: 844
download_size: 79017
dataset_size: 339040
- config_name: templated_xlel_wd
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 1105593
num_examples: 639
download_size: 403809
dataset_size: 1105593
- config_name: translated_adversarial_qa
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 23828637
num_examples: 10000
download_size: 5853372
dataset_size: 23828637
- config_name: translated_cnn_dailymail
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 624416386
num_examples: 100000
download_size: 228934790
dataset_size: 624416386
- config_name: translated_dolly
features:
- name: task_type
dtype: string
- name: split
dtype: string
- name: script
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 32136437
num_examples: 14808
download_size: 12268225
dataset_size: 32136437
- config_name: translated_flan_coqa
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 42954081
num_examples: 6409
download_size: 15878737
dataset_size: 42954081
- config_name: translated_flan_cot
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 103946965
num_examples: 91910
download_size: 36013799
dataset_size: 103946965
- config_name: translated_flan_gem_wiki
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 171947547
num_examples: 27147
download_size: 61509697
dataset_size: 171947547
- config_name: translated_flan_lambada
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 3350933
num_examples: 4279
download_size: 1244741
dataset_size: 3350933
- config_name: translated_flan_qa
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 466231
num_examples: 540
download_size: 163927
dataset_size: 466231
- config_name: translated_hotpotqa
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 173446675
num_examples: 355476
download_size: 51566169
dataset_size: 173446675
- config_name: translated_joke_explaination
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 1427307
num_examples: 754
download_size: 324060
dataset_size: 1427307
- config_name: translated_mintaka
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 5737422
num_examples: 14000
download_size: 969828
dataset_size: 5737422
- config_name: translated_nqopen
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 55232722
num_examples: 175850
download_size: 15606726
dataset_size: 55232722
- config_name: translated_paws
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 47144986
num_examples: 49401
download_size: 6120004
dataset_size: 47144986
- config_name: translated_piqa
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 19252904
num_examples: 16113
download_size: 5383085
dataset_size: 19252904
- config_name: translated_soda
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 1112271687
num_examples: 1191582
download_size: 309159822
dataset_size: 1112271687
- config_name: translated_wiki_split
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 1111439015
num_examples: 989944
download_size: 326772204
dataset_size: 1111439015
- config_name: translated_wikiqa
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 730463
num_examples: 1040
download_size: 261132
dataset_size: 730463
- config_name: translated_xlel_wd
features:
- name: task_type
dtype: string
- name: script
dtype: string
- name: split
dtype: string
- name: inputs
dtype: string
- name: language
dtype: string
- name: id
dtype: int64
- name: sub_dataset_name
dtype: string
- name: dataset_name
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 859360927
num_examples: 523112
download_size: 320781896
dataset_size: 859360927
configs:
- config_name: complete_dataset
data_files:
- split: train
path: complete_dataset/train-*
- config_name: templated_indic_paraphrase
data_files:
- split: train
path: templated_indic_paraphrase/train-*
- config_name: templated_indic_sentiment
data_files:
- split: train
path: templated_indic_sentiment/train-*
- config_name: templated_telugu_food
data_files:
- split: train
path: templated_telugu_food/train-*
- config_name: templated_telugu_jokes
data_files:
- split: train
path: templated_telugu_jokes/train-*
- config_name: templated_telugu_news
data_files:
- split: train
path: templated_telugu_news/train-*
- config_name: templated_telugu_poems
data_files:
- split: train
path: templated_telugu_poems/train-*
- config_name: templated_telugu_riddles
data_files:
- split: train
path: templated_telugu_riddles/train-*
- config_name: templated_xlel_wd
data_files:
- split: train
path: templated_xlel_wd/train-*
- config_name: translated_adversarial_qa
data_files:
- split: train
path: translated_adversarial_qa/train-*
- config_name: translated_cnn_dailymail
data_files:
- split: train
path: translated_cnn_dailymail/train-*
- config_name: translated_dolly
data_files:
- split: train
path: translated_dolly/train-*
- config_name: translated_flan_coqa
data_files:
- split: train
path: translated_flan_coqa/train-*
- config_name: translated_flan_cot
data_files:
- split: train
path: translated_flan_cot/train-*
- config_name: translated_flan_gem_wiki
data_files:
- split: train
path: translated_flan_gem_wiki/train-*
- config_name: translated_flan_lambada
data_files:
- split: train
path: translated_flan_lambada/train-*
- config_name: translated_flan_qa
data_files:
- split: train
path: translated_flan_qa/train-*
- config_name: translated_hotpotqa
data_files:
- split: train
path: translated_hotpotqa/train-*
- config_name: translated_joke_explaination
data_files:
- split: train
path: translated_joke_explaination/train-*
- config_name: translated_mintaka
data_files:
- split: train
path: translated_mintaka/train-*
- config_name: translated_nqopen
data_files:
- split: train
path: translated_nqopen/train-*
- config_name: translated_paws
data_files:
- split: train
path: translated_paws/train-*
- config_name: translated_piqa
data_files:
- split: train
path: translated_piqa/train-*
- config_name: translated_soda
data_files:
- split: train
path: translated_soda/train-*
- config_name: translated_wiki_split
data_files:
- split: train
path: translated_wiki_split/train-*
- config_name: translated_wikiqa
data_files:
- split: train
path: translated_wikiqa/train-*
- config_name: translated_xlel_wd
data_files:
- split: train
path: translated_xlel_wd/train-*
license: apache-2.0
language:
- en
- te
---
# Aya_Telgu
This Dataset is curated from the original [Aya-Collection](https://huggingface.co/datasets/CohereForAI/aya_collection) dataset that was open-sourced by [Cohere](https://cohere.com/research) under the [Apache-2.0](https://choosealicense.com/licenses/apache-2.0/) license.
The Aya Collection is a massive multilingual collection comprising 513 million instances of prompts and completions that cover a wide range of tasks. This collection uses instruction-style templates from fluent speakers and applies them to a curated list of datasets. It also includes translations of instruction-style datasets into 101 languages. The Aya Dataset, a human-curated multilingual instruction and response dataset, is part of this collection. Refer to the original Aya paper for more details about the collection.
### Motivations & Intentions
The original dataset is large and organized by task rather than by language. To work on a single Indic language, one previously had to download the entire collection (~600 GB) and filter it.
As we were training an Indic LLM internally, we filtered the collection by language and curated this Telugu subset.
You can find all the Indic-language-specific datasets [here](https://huggingface.co/collections/Cognitive-Lab/aya-indic-suite-65eaa0e34a2307f30bbd55e5).
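The language filtering described above can be sketched as follows. This is a minimal illustration with hypothetical stand-in records, not the actual curation pipeline; the field names match the schema documented below.

```python
# Minimal sketch of filtering Aya Collection rows by language.
# The records below are hypothetical stand-ins, not real dataset rows.
records = [
    {"id": 1, "language": "tel", "inputs": "...", "targets": "..."},
    {"id": 2, "language": "eng", "inputs": "...", "targets": "..."},
    {"id": 3, "language": "tel", "inputs": "...", "targets": "..."},
]

def filter_by_language(rows, iso_code):
    """Keep only rows whose `language` field matches the given ISO code."""
    return [row for row in rows if row["language"] == iso_code]

telugu_rows = filter_by_language(records, "tel")
print(len(telugu_rows))  # -> 2
```

In practice the same filter can be expressed with `datasets.Dataset.filter` once the full collection is downloaded; the list comprehension above only shows the selection criterion.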
## **Data Instances**
An example of a `train` instance looks as follows:
```python
{'id': 246001,
'inputs': 'The following query in English is taken from the geography category. What could be the answer to the question?\nWhat is the seventh tallest mountain in North America?',
'targets': 'The answer is Mount Lucania.',
'dataset_name': 'Mintaka-inst',
'sub_dataset_name': '-',
'task_type': 'question-answering',
'template_id': 3,
'language': 'eng',
'split': 'train',
'script': 'Latn'
}
```
## **Data Fields**
The data fields are the same among all splits:
- `id:` Unique id of the data point.
- `inputs:` Prompt or input to the language model.
- `targets:` Completion or output of the language model.
- `dataset_name:` The name of the source dataset that the data point was taken from.
- `sub_dataset_name:` If the source is a collection, this field indicates which part of that collection the data point was taken from. If it is not a collection, this field is left blank.
- `task_type:` The task type that this conversation belongs to.
- `template_id:` The id of the template applied to this data point.
- `language:` The ISO code of the dialect of the conversation.
- `script:` The script of the language.
- `split:` Indicates whether the data point is part of the `train` or the `test` split.
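As a quick sanity check, a record can be validated against the field list above. The record here is a hypothetical stand-in shaped like the `train` instance shown earlier.

```python
# Sketch: verify that a record carries every field described in "Data Fields".
EXPECTED_FIELDS = {
    "id", "inputs", "targets", "dataset_name", "sub_dataset_name",
    "task_type", "template_id", "language", "script", "split",
}

record = {
    "id": 246001,
    "inputs": "What is the seventh tallest mountain in North America?",
    "targets": "The answer is Mount Lucania.",
    "dataset_name": "Mintaka-inst",
    "sub_dataset_name": "-",
    "task_type": "question-answering",
    "template_id": 3,
    "language": "eng",
    "script": "Latn",
    "split": "train",
}

missing = EXPECTED_FIELDS - record.keys()
print(sorted(missing))  # -> []
```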
## **Licensing Information**
This dataset can be used for any purpose, whether academic or commercial, under the terms of the **[Apache 2.0](https://opensource.org/license/apache-2-0)** License.
## **Citation**
```bibtex
@misc{singh2024aya,
title={Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning},
author={Shivalika Singh and Freddie Vargus and Daniel Dsouza and Börje F. Karlsson and Abinaya Mahendiran and Wei-Yin Ko and Herumb Shandilya and Jay Patel and Deividas Mataciunas and Laura OMahony and Mike Zhang and Ramith Hettiarachchi and Joseph Wilson and Marina Machado and Luisa Souza Moura and Dominik Krzemiński and Hakimeh Fadaei and Irem Ergün and Ifeoma Okoh and Aisha Alaagib and Oshan Mudannayake and Zaid Alyafeai and Vu Minh Chien and Sebastian Ruder and Surya Guthikonda and Emad A. Alghamdi and Sebastian Gehrmann and Niklas Muennighoff and Max Bartolo and Julia Kreutzer and Ahmet Üstün and Marzieh Fadaee and Sara Hooker},
year={2024},
eprint={2402.06619},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
CVasNLPExperiments/StanfordCars_test_google_flan_t5_xl_mode_A_ns_8041 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 3873420
num_examples: 8041
download_size: 859959
dataset_size: 3873420
---
# Dataset Card for "StanfordCars_test_google_flan_t5_xl_mode_A_ns_8041"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ssunbell/boostcamp-docvqa-v4-test | ---
dataset_info:
features:
- name: questionId
dtype: int64
- name: question
dtype: string
- name: image
sequence:
sequence:
sequence:
sequence: uint8
- name: docId
dtype: int64
- name: ucsf_document_id
dtype: string
- name: ucsf_document_page_no
dtype: string
- name: data_split
dtype: string
- name: words
sequence: string
- name: boxes
sequence:
sequence: int64
splits:
- name: test
num_bytes: 843104716
num_examples: 5188
download_size: 297218666
dataset_size: 843104716
---
# Dataset Card for "boostcamp-docvqa-v4-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aayush196/funsd_lmv2 | ---
license: mit
---
|
open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ | ---
pretty_name: Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/orca_mini_v3_13B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T12:38:59.699618](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ/blob/main/results_2023-12-04T12-38-59.699618.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5613401785987572,\n\
\ \"acc_stderr\": 0.033576900106646816,\n \"acc_norm\": 0.5663280514839403,\n\
\ \"acc_norm_stderr\": 0.0342786577705747,\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.4922092515317753,\n\
\ \"mc2_stderr\": 0.015510989644544924\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.617805218084047,\n\
\ \"acc_stderr\": 0.004849306998727771,\n \"acc_norm\": 0.81557458673571,\n\
\ \"acc_norm_stderr\": 0.003870381199967957\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286637,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.041124909746707884,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.041124909746707884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523846,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523846\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624527,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624527\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7541284403669725,\n \"acc_stderr\": 0.018461940968708436,\n \"\
acc_norm\": 0.7541284403669725,\n \"acc_norm_stderr\": 0.018461940968708436\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0272360139461967,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0272360139461967\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\
\ \"acc_stderr\": 0.015357212665829461,\n \"acc_norm\": 0.756066411238825,\n\
\ \"acc_norm_stderr\": 0.015357212665829461\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895803,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3787709497206704,\n\
\ \"acc_stderr\": 0.016223533510365113,\n \"acc_norm\": 0.3787709497206704,\n\
\ \"acc_norm_stderr\": 0.016223533510365113\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.0268228017595079,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.0268228017595079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087377,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087377\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.033455630703391935,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.033455630703391935\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3561811505507956,\n\
\ \"mc1_stderr\": 0.016763790728446335,\n \"mc2\": 0.4922092515317753,\n\
\ \"mc2_stderr\": 0.015510989644544924\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29492039423805916,\n \
\ \"acc_stderr\": 0.01256069801095475\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|arc:challenge|25_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|gsm8k|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hellaswag|10_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T12-38-59.699618.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- '**/details_harness|winogrande|5_2023-12-04T12-38-59.699618.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T12-38-59.699618.parquet'
- config_name: results
data_files:
- split: 2023_12_04T12_38_59.699618
path:
- results_2023-12-04T12-38-59.699618.parquet
- split: latest
path:
- results_2023-12-04T12-38-59.699618.parquet
---
# Dataset Card for Evaluation run of TheBloke/orca_mini_v3_13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/orca_mini_v3_13B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ",
"harness_winogrande_5",
	split="latest")
```
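The configuration names above follow a regular pattern, so you can construct them programmatically instead of copying them by hand. Below is a minimal, hypothetical helper (not part of this dataset's tooling) that builds the configuration name for an MMLU subject from the `harness_hendrycksTest_<subject>_<n_shot>` pattern used throughout this card:

```python
def mmlu_config(subject: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU subject, following the
    harness_hendrycksTest_<subject>_<n_shot> pattern used in this card."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"


# Example: the config name for the abstract_algebra subject
config_name = mmlu_config("abstract_algebra")
# → "harness_hendrycksTest_abstract_algebra_5"
```

The resulting string can then be passed as the second argument to `load_dataset` in the example above.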
## Latest results
These are the [latest results from run 2023-12-04T12:38:59.699618](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__orca_mini_v3_13B-GPTQ/blob/main/results_2023-12-04T12-38-59.699618.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each task's results in its own configuration, in the "latest" split):
```json
{
"all": {
"acc": 0.5613401785987572,
"acc_stderr": 0.033576900106646816,
"acc_norm": 0.5663280514839403,
"acc_norm_stderr": 0.0342786577705747,
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.4922092515317753,
"mc2_stderr": 0.015510989644544924
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349814
},
"harness|hellaswag|10": {
"acc": 0.617805218084047,
"acc_stderr": 0.004849306998727771,
"acc_norm": 0.81557458673571,
"acc_norm_stderr": 0.003870381199967957
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286637,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.041124909746707884,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.041124909746707884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523846,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523846
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557836,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557836
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624527,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624527
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7541284403669725,
"acc_stderr": 0.018461940968708436,
"acc_norm": 0.7541284403669725,
"acc_norm_stderr": 0.018461940968708436
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0272360139461967,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0272360139461967
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.756066411238825,
"acc_stderr": 0.015357212665829461,
"acc_norm": 0.756066411238825,
"acc_norm_stderr": 0.015357212665829461
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895803,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3787709497206704,
"acc_stderr": 0.016223533510365113,
"acc_norm": 0.3787709497206704,
"acc_norm_stderr": 0.016223533510365113
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.0268228017595079,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.0268228017595079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087377,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087377
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.033455630703391935,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.033455630703391935
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3561811505507956,
"mc1_stderr": 0.016763790728446335,
"mc2": 0.4922092515317753,
"mc2_stderr": 0.015510989644544924
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174789
},
"harness|gsm8k|5": {
"acc": 0.29492039423805916,
"acc_stderr": 0.01256069801095475
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
venkat-srinivasan-nexusflow/multiapi_prototype_CVECPE_Only_dec11 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prediction
dtype: string
- name: ground_truth
dtype: string
- name: correctness
dtype: int64
splits:
- name: standard
num_bytes: 23976
num_examples: 78
download_size: 13073
dataset_size: 23976
configs:
- config_name: default
data_files:
- split: standard
path: data/standard-*
---
|
HeTree/MevakerConcTree | ---
license: apache-2.0
language:
- he
task_categories:
- zero-shot-classification
---
## MevakerConcTree
A dataset intended for the conclusion allocation task.
The dataset represents several states of pre-allocated conclusions to a given hierarchical heading structure.
### Citing
If you use MevakerConcTree in your research, please cite [Mevaker: Conclusion Extraction and Allocation Resources for the Hebrew Language](https://arxiv.org/abs/2403.09719).
```
@article{shalumov2024mevaker,
title={Mevaker: Conclusion Extraction and Allocation Resources for the Hebrew Language},
author={Vitaly Shalumov and Harel Haskey and Yuval Solaz},
year={2024},
eprint={2403.09719},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
Boliang/math_rephrased_processed_dataset | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Response
dtype: string
splits:
- name: train
num_bytes: 36609433
num_examples: 50000
download_size: 14465642
dataset_size: 36609433
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
oe0110/dataset | ---
license: mit
---
|
Mirrar/Longcu | ---
license: mpl-2.0
---
|
ryanramos/vqa-with-coco-img-2 | ---
dataset_info:
features:
- name: license
dtype: int64
- name: file_name
dtype: string
- name: coco_url
dtype: string
- name: height
dtype: int64
- name: width
dtype: int64
- name: date_captured
dtype: string
- name: flickr_url
dtype: string
- name: captions
list:
- name: caption
dtype: string
- name: id
dtype: int64
- name: questions
list:
- name: answer_type
dtype: string
- name: answers
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: image_id
dtype: int64
- name: multiple_choice_answer
dtype: string
- name: question
dtype: string
- name: question_id
dtype: int64
- name: question_type
dtype: string
- name: image_id
dtype: int64
- name: image
dtype: image
splits:
- name: train
num_bytes: 883496810.5
num_examples: 16500
download_size: 854737087
dataset_size: 883496810.5
---
# Dataset Card for "vqa-with-coco-img-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JWBickel/bible_topics | ---
language:
- en
configs:
- config_name: topics
data_files: "topics.json"
- config_name: refs
data_files: "topics_with_refs.json"
- config_name: scores
data_files: "topic_scores.json"
- config_name: votes
data_files: "topic_votes.json"
---
These are topics with their verse references. Some of them have cross-references, and some of the cross-references have been voted on.
topic_scores.json and topic_votes.json are both from openbible.info, retrieved November 1, 2023. |
WilliamWen/battery_by_shuhuang | ---
license: apache-2.0
task_categories:
- token-classification
language:
- en
--- |
phongmt184172/mtet | ---
task_categories:
- translation
language:
- en
- vi
size_categories:
- 100M<n<1B
---
```python
from datasets import load_dataset

dataset = load_dataset("phongmt184172/mtet")
```

This dataset is cloned from https://github.com/vietai/mTet for the machine translation task. |
mask-distilled-one-sec-cv12/chunk_266 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 917339076
num_examples: 180153
download_size: 931959289
dataset_size: 917339076
---
# Dataset Card for "chunk_266"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ShubhamRS13/indian_food_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1654120641.3594337
num_examples: 5328
- name: test
num_bytes: 222111118.3925666
num_examples: 941
download_size: 1601622156
dataset_size: 1876231759.7520003
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Navarro20/Blu | ---
license: openrail
---
|
Thanmay/implicit_hate-hi | ---
dataset_info:
features:
- name: label
dtype: string
- name: text
dtype: string
- name: target_groups
sequence: string
- name: id
dtype: int64
- name: toxicity_score
dtype: float64
- name: itv2 hi text
dtype: string
splits:
- name: validation
num_bytes: 3207
num_examples: 9
- name: test
num_bytes: 5347462
num_examples: 14191
download_size: 2361315
dataset_size: 5350669
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
el2e10/aya-paraphrase | ---
license: cc
task_categories:
- text-generation
language:
- ml
- gu
- mr
- hi
- pa
- bn
pretty_name: Aya Paraphrase
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: mal
path: data/mal.parquet
- split: ben
path: data/ben.parquet
- split: guj
path: data/guj.parquet
- split: hin
path: data/hin.parquet
- split: mar
path: data/mar.parquet
- split: pan
path: data/pan.parquet
---
### Description
This dataset is derived from an existing dataset made by AI4Bharat: we used the [IndicXParaphrase](https://huggingface.co/datasets/ai4bharat/IndicXParaphrase) dataset to create this instruction-style dataset.
This was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
IndicXParaphrase is a multilingual, n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset (IndicXParaphrase) was made available under the CC0 license.
### Template
The following templates were used for converting the original dataset:
```
#Template 1
prompt:
Write the following sentence using different words: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 2
prompt:
Rewrite the following sentence in different way: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 3
prompt:
Paraphrase the following sentence:: "{original_sentence}"
completion:
{paraphrased_sentence}
```
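The conversion itself is a straightforward string-formatting step over each (original, paraphrase) pair. A minimal sketch of how one pair could become a prompt/completion record (the template strings follow the card verbatim; the function name and the random choice of template are assumptions, not part of the original pipeline):

```python
import random

# Prompt templates copied from the card; how a template is picked per row is an assumption.
TEMPLATES = [
    'Write the following sentence using different words: "{original_sentence}"',
    'Rewrite the following sentence in different way: "{original_sentence}"',
    'Paraphrase the following sentence:: "{original_sentence}"',
]

def to_instruction(original_sentence: str, paraphrased_sentence: str) -> dict:
    """Turn one paraphrase pair into an instruction-style prompt/completion record."""
    prompt = random.choice(TEMPLATES).format(original_sentence=original_sentence)
    return {"prompt": prompt, "completion": paraphrased_sentence}

example = to_instruction("The sky is blue.", "The sky appears blue.")
```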
### Acknowledgement
Thank you to Jay Patel for providing the Gujarati translations, Amarjit for the Punjabi translations,
Yogesh Haribhau Kulkarni for the Marathi translations,
Ganesh Jagadeesan for the Hindi translations, and Tahmid Hossain for the Bengali translations of the English prompts mentioned above. |
liuyanchen1015/MULTI_VALUE_wnli_conditional_were_was | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 1139
num_examples: 6
download_size: 3309
dataset_size: 1139
---
# Dataset Card for "MULTI_VALUE_wnli_conditional_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sethapun/arithmetic_2md_1to1 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 54000
num_examples: 2000
- name: validation
num_bytes: 10800
num_examples: 400
download_size: 4984
dataset_size: 64800
---
# Dataset Card for "arithmetic_2md_1to1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonasantos5240/vozada | ---
license: openrail
---
|
Axel578/dmdmmf | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 2188995.875
num_examples: 2705
download_size: 0
dataset_size: 2188995.875
---
# Dataset Card for "dmdmmf"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Liareizz/JAIRABURNS | ---
license: openrail
---
|
open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile | ---
pretty_name: Evaluation run of RWKV/rwkv-4-1b5-pile
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RWKV/rwkv-4-1b5-pile](https://huggingface.co/RWKV/rwkv-4-1b5-pile) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T23:40:07.821046](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile/blob/main/results_2023-10-12T23-40-07.821046.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0045092281879194635,\n\
\ \"em_stderr\": 0.0006861346899094988,\n \"f1\": 0.052268246644295364,\n\
\ \"f1_stderr\": 0.0013999730869889793,\n \"acc\": 0.2691397000789266,\n\
\ \"acc_stderr\": 0.007005621297482059\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0045092281879194635,\n \"em_stderr\": 0.0006861346899094988,\n\
\ \"f1\": 0.052268246644295364,\n \"f1_stderr\": 0.0013999730869889793\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5382794001578532,\n\
\ \"acc_stderr\": 0.014011242594964118\n }\n}\n```"
repo_url: https://huggingface.co/RWKV/rwkv-4-1b5-pile
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T23_40_07.821046
path:
- '**/details_harness|drop|3_2023-10-12T23-40-07.821046.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T23-40-07.821046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T23_40_07.821046
path:
- '**/details_harness|gsm8k|5_2023-10-12T23-40-07.821046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T23-40-07.821046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:09:25.053810.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T05:09:25.053810.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T23_40_07.821046
path:
- '**/details_harness|winogrande|5_2023-10-12T23-40-07.821046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T23-40-07.821046.parquet'
- config_name: results
data_files:
- split: 2023_09_03T05_09_25.053810
path:
- results_2023-09-03T05:09:25.053810.parquet
- split: 2023_10_12T23_40_07.821046
path:
- results_2023-10-12T23-40-07.821046.parquet
- split: latest
path:
- results_2023-10-12T23-40-07.821046.parquet
---
# Dataset Card for Evaluation run of RWKV/rwkv-4-1b5-pile
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/RWKV/rwkv-4-1b5-pile
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [RWKV/rwkv-4-1b5-pile](https://huggingface.co/RWKV/rwkv-4-1b5-pile) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
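Judging from the configurations above, the timestamped split names appear to be derived from the raw run timestamp by replacing characters that are not valid in split names; a minimal illustrative helper (an assumption about the naming scheme, not part of the official tooling):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a raw run timestamp to its split name by replacing "-" and ":"
    with "_" (the "." in the microseconds part is kept), e.g.
    "2023-09-03T05:09:25.053810" -> "2023_09_03T05_09_25.053810"."""
    return ts.replace("-", "_").replace(":", "_")
```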
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-12T23:40:07.821046](https://huggingface.co/datasets/open-llm-leaderboard/details_RWKV__rwkv-4-1b5-pile/blob/main/results_2023-10-12T23-40-07.821046.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0045092281879194635,
"em_stderr": 0.0006861346899094988,
"f1": 0.052268246644295364,
"f1_stderr": 0.0013999730869889793,
"acc": 0.2691397000789266,
"acc_stderr": 0.007005621297482059
},
"harness|drop|3": {
"em": 0.0045092281879194635,
"em_stderr": 0.0006861346899094988,
"f1": 0.052268246644295364,
"f1_stderr": 0.0013999730869889793
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5382794001578532,
"acc_stderr": 0.014011242594964118
}
}
```
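The nested per-task dictionary above is straightforward to flatten into (task, metric, value) rows for quick inspection; a small sketch using the values shown (hard-coded here purely for illustration):

```python
# Per-task metrics copied from the results above (illustrative subset).
results = {
    "harness|drop|3": {"em": 0.0045092281879194635, "f1": 0.052268246644295364},
    "harness|gsm8k|5": {"acc": 0.0},
    "harness|winogrande|5": {"acc": 0.5382794001578532},
}

# Flatten the nested dict into sorted (task, metric, value) rows.
rows = [
    (task, metric, value)
    for task, metrics in sorted(results.items())
    for metric, value in sorted(metrics.items())
]

for task, metric, value in rows:
    print(f"{task:25s} {metric:4s} {value:.4f}")
```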
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_nbeerbower__flammen15-gutenberg-DPO-v1-7B | ---
pretty_name: Evaluation run of nbeerbower/flammen15-gutenberg-DPO-v1-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/flammen15-gutenberg-DPO-v1-7B](https://huggingface.co/nbeerbower/flammen15-gutenberg-DPO-v1-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen15-gutenberg-DPO-v1-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T21:12:50.334959](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen15-gutenberg-DPO-v1-7B/blob/main/results_2024-04-05T21-12-50.334959.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543625578367189,\n\
\ \"acc_stderr\": 0.032048363173656905,\n \"acc_norm\": 0.6538258429284167,\n\
\ \"acc_norm_stderr\": 0.03271512738244504,\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404413,\n \"mc2\": 0.7198358838748503,\n\
\ \"mc2_stderr\": 0.014630172765675317\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6919795221843004,\n \"acc_stderr\": 0.013491429517292038,\n\
\ \"acc_norm\": 0.7150170648464164,\n \"acc_norm_stderr\": 0.013191348179838795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7081258713403704,\n\
\ \"acc_stderr\": 0.0045369557965105455,\n \"acc_norm\": 0.8836885082652858,\n\
\ \"acc_norm_stderr\": 0.003199428675985867\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451156,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523367,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5618115055079559,\n\
\ \"mc1_stderr\": 0.017369236164404413,\n \"mc2\": 0.7198358838748503,\n\
\ \"mc2_stderr\": 0.014630172765675317\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498431\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7134192570128886,\n \
\ \"acc_stderr\": 0.012454841668337694\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/flammen15-gutenberg-DPO-v1-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-12-50.334959.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-12-50.334959.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- '**/details_harness|winogrande|5_2024-04-05T21-12-50.334959.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T21-12-50.334959.parquet'
- config_name: results
data_files:
- split: 2024_04_05T21_12_50.334959
path:
- results_2024-04-05T21-12-50.334959.parquet
- split: latest
path:
- results_2024-04-05T21-12-50.334959.parquet
---
# Dataset Card for Evaluation run of nbeerbower/flammen15-gutenberg-DPO-v1-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/flammen15-gutenberg-DPO-v1-7B](https://huggingface.co/nbeerbower/flammen15-gutenberg-DPO-v1-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen15-gutenberg-DPO-v1-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-05T21:12:50.334959](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen15-gutenberg-DPO-v1-7B/blob/main/results_2024-04-05T21-12-50.334959.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543625578367189,
"acc_stderr": 0.032048363173656905,
"acc_norm": 0.6538258429284167,
"acc_norm_stderr": 0.03271512738244504,
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404413,
"mc2": 0.7198358838748503,
"mc2_stderr": 0.014630172765675317
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292038,
"acc_norm": 0.7150170648464164,
"acc_norm_stderr": 0.013191348179838795
},
"harness|hellaswag|10": {
"acc": 0.7081258713403704,
"acc_stderr": 0.0045369557965105455,
"acc_norm": 0.8836885082652858,
"acc_norm_stderr": 0.003199428675985867
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523367,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5618115055079559,
"mc1_stderr": 0.017369236164404413,
"mc2": 0.7198358838748503,
"mc2_stderr": 0.014630172765675317
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498431
},
"harness|gsm8k|5": {
"acc": 0.7134192570128886,
"acc_stderr": 0.012454841668337694
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
malaysia-ai/mosaic-combine-all | ---
language:
- ms
---
# Mosaic format for the combined dataset to train a Malaysian LLM
This repository stores dataset shards in Mosaic format.
1. prepared at https://github.com/malaysia-ai/dedup-text-dataset/blob/main/pretrain-llm/combine-all.ipynb
2. using tokenizer https://huggingface.co/malaysia-ai/bpe-tokenizer
3. context length of 4096 tokens.
## how-to
1. git clone,
```bash
git lfs clone https://huggingface.co/datasets/malaysia-ai/mosaic-combine-all
```
2. load it,
```python
from streaming import LocalDataset
import numpy as np
from streaming.base.format.mds.encodings import Encoding, _encodings

# custom codec: token ids are stored as raw uint16 bytes in the MDS shards
class UInt16(Encoding):
    def encode(self, obj) -> bytes:
        return obj.tobytes()

    def decode(self, data: bytes):
        return np.frombuffer(data, np.uint16)

# register the codec before opening the dataset so shards can be decoded
_encodings['uint16'] = UInt16

dataset = LocalDataset('mosaic-combine-all')
len(dataset)
``` |
fattah/wepoo | ---
license: ofl-1.1
---
|
AlekseyScorpi/vacancies_promts | ---
license: mit
dataset_info:
features:
- name: prompt
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 182050998
num_examples: 48564
download_size: 75040493
dataset_size: 182050998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-generation
language:
- ru
tags:
- code
size_categories:
- 10K<n<100K
--- |
open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B | ---
pretty_name: Evaluation run of Sao10K/Stheno-Inverted-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Stheno-Inverted-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T14:49:52.594706](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B/blob/main/results_2023-10-24T14-49-52.594706.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005453020134228188,\n\
\ \"em_stderr\": 0.0007541727796792602,\n \"f1\": 0.08334836409396004,\n\
\ \"f1_stderr\": 0.00173175395556551,\n \"acc\": 0.43967650267207525,\n\
\ \"acc_stderr\": 0.01076620685162581\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.005453020134228188,\n \"em_stderr\": 0.0007541727796792602,\n\
\ \"f1\": 0.08334836409396004,\n \"f1_stderr\": 0.00173175395556551\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13191811978771797,\n \
\ \"acc_stderr\": 0.009321265253857515\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|arc:challenge|25_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T14_49_52.594706
path:
- '**/details_harness|drop|3_2023-10-24T14-49-52.594706.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T14-49-52.594706.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T14_49_52.594706
path:
- '**/details_harness|gsm8k|5_2023-10-24T14-49-52.594706.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T14-49-52.594706.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hellaswag|10_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T22:34:24.452875.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T22:34:24.452875.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T14_49_52.594706
path:
- '**/details_harness|winogrande|5_2023-10-24T14-49-52.594706.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T14-49-52.594706.parquet'
- config_name: results
data_files:
- split: 2023_08_31T22_34_24.452875
path:
- results_2023-08-31T22:34:24.452875.parquet
- split: 2023_10_24T14_49_52.594706
path:
- results_2023-10-24T14-49-52.594706.parquet
- split: latest
path:
- results_2023-10-24T14-49-52.594706.parquet
---
# Dataset Card for Evaluation run of Sao10K/Stheno-Inverted-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Stheno-Inverted-L2-13B](https://huggingface.co/Sao10K/Stheno-Inverted-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T14:49:52.594706](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Stheno-Inverted-L2-13B/blob/main/results_2023-10-24T14-49-52.594706.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.005453020134228188,
"em_stderr": 0.0007541727796792602,
"f1": 0.08334836409396004,
"f1_stderr": 0.00173175395556551,
"acc": 0.43967650267207525,
"acc_stderr": 0.01076620685162581
},
"harness|drop|3": {
"em": 0.005453020134228188,
"em_stderr": 0.0007541727796792602,
"f1": 0.08334836409396004,
"f1_stderr": 0.00173175395556551
},
"harness|gsm8k|5": {
"acc": 0.13191811978771797,
"acc_stderr": 0.009321265253857515
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
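The aggregated JSON above can also be consumed programmatically. As a minimal sketch (the dict literal below simply mirrors the excerpt shown here, so no download is needed), one could pull out the per-task accuracies while skipping the `"all"` aggregate:

```python
# Mirror of the aggregated-results excerpt shown in this card.
results = {
    "all": {
        "em": 0.005453020134228188,
        "f1": 0.08334836409396004,
        "acc": 0.43967650267207525,
    },
    "harness|drop|3": {"em": 0.005453020134228188, "f1": 0.08334836409396004},
    "harness|gsm8k|5": {"acc": 0.13191811978771797},
    "harness|winogrande|5": {"acc": 0.7474348855564326},
}

# Collect per-task accuracy, skipping the "all" aggregate and tasks
# (like drop) that report em/f1 instead of acc.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(per_task_acc)
```

The same traversal works on the full dict returned by loading the `results` configuration, which contains one such entry per evaluated task.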
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a256 | ---
pretty_name: Evaluation run of BFauber/lora_llama2-13b_10e5_r128_a256
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/lora_llama2-13b_10e5_r128_a256](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r128_a256)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a256\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T01:41:13.878952](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a256/blob/main/results_2024-02-10T01-41-13.878952.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5475513509209767,\n\
\ \"acc_stderr\": 0.03364466522313527,\n \"acc_norm\": 0.5535754457106707,\n\
\ \"acc_norm_stderr\": 0.03437240325899414,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.37818229047033813,\n\
\ \"mc2_stderr\": 0.01371187114283475\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182528,\n\
\ \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6163114917347142,\n\
\ \"acc_stderr\": 0.004852896681736758,\n \"acc_norm\": 0.8207528380800637,\n\
\ \"acc_norm_stderr\": 0.0038277525727700226\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n\
\ \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.02422996529842508,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.02422996529842508\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04006168083848879,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04006168083848879\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.02614868593067175,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.02614868593067175\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5205128205128206,\n \"acc_stderr\": 0.02532966316348994,\n \
\ \"acc_norm\": 0.5205128205128206,\n \"acc_norm_stderr\": 0.02532966316348994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929187,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929187\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404032,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404032\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n\
\ \"acc_stderr\": 0.01556925469204576,\n \"acc_norm\": 0.7458492975734355,\n\
\ \"acc_norm_stderr\": 0.01556925469204576\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017771,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017771\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.02807415894760065,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.02807415894760065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192717,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516475,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516475\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.01261820406658839,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.01261820406658839\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455504,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455504\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.553921568627451,\n \"acc_stderr\": 0.020109864547181354,\n \
\ \"acc_norm\": 0.553921568627451,\n \"acc_norm_stderr\": 0.020109864547181354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355558,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355558\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.37818229047033813,\n\
\ \"mc2_stderr\": 0.01371187114283475\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7632202052091555,\n \"acc_stderr\": 0.011947592365207392\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21076573161485973,\n \
\ \"acc_stderr\": 0.011234280469030465\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/lora_llama2-13b_10e5_r128_a256
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-41-13.878952.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-41-13.878952.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- '**/details_harness|winogrande|5_2024-02-10T01-41-13.878952.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T01-41-13.878952.parquet'
- config_name: results
data_files:
- split: 2024_02_10T01_41_13.878952
path:
- results_2024-02-10T01-41-13.878952.parquet
- split: latest
path:
- results_2024-02-10T01-41-13.878952.parquet
---
# Dataset Card for Evaluation run of BFauber/lora_llama2-13b_10e5_r128_a256
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/lora_llama2-13b_10e5_r128_a256](https://huggingface.co/BFauber/lora_llama2-13b_10e5_r128_a256) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a256",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-10T01:41:13.878952](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__lora_llama2-13b_10e5_r128_a256/blob/main/results_2024-02-10T01-41-13.878952.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5475513509209767,
"acc_stderr": 0.03364466522313527,
"acc_norm": 0.5535754457106707,
"acc_norm_stderr": 0.03437240325899414,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.37818229047033813,
"mc2_stderr": 0.01371187114283475
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182528,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.6163114917347142,
"acc_stderr": 0.004852896681736758,
"acc_norm": 0.8207528380800637,
"acc_norm_stderr": 0.0038277525727700226
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.02422996529842508,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.02422996529842508
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848879,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848879
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5205128205128206,
"acc_stderr": 0.02532966316348994,
"acc_norm": 0.5205128205128206,
"acc_norm_stderr": 0.02532966316348994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929187,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929187
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404032,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404032
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7458492975734355,
"acc_stderr": 0.01556925469204576,
"acc_norm": 0.7458492975734355,
"acc_norm_stderr": 0.01556925469204576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017771,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017771
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.02807415894760065,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.02807415894760065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192717,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516475,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516475
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.01261820406658839,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.01261820406658839
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.553921568627451,
"acc_stderr": 0.020109864547181354,
"acc_norm": 0.553921568627451,
"acc_norm_stderr": 0.020109864547181354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355558,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355558
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.37818229047033813,
"mc2_stderr": 0.01371187114283475
},
"harness|winogrande|5": {
"acc": 0.7632202052091555,
"acc_stderr": 0.011947592365207392
},
"harness|gsm8k|5": {
"acc": 0.21076573161485973,
"acc_stderr": 0.011234280469030465
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NafishZaldinanda/common_voice_16_0_id_pseudo_labelled | ---
dataset_info:
config_name: id
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 165337917.496
num_examples: 4969
- name: validation
num_bytes: 100791420.5
num_examples: 3340
- name: test
num_bytes: 114085182.172
num_examples: 3642
download_size: 371542085
dataset_size: 380214520.168
configs:
- config_name: id
data_files:
- split: train
path: id/train-*
- split: validation
path: id/validation-*
- split: test
path: id/test-*
---
|
pixparse/idl-wds |
---
license: other
license_name: idl-train
license_link: LICENSE
task_categories:
- image-to-text
size_categories:
- 10M<n<100M
---
# Dataset Card for Industry Documents Library (IDL)
## Dataset Description
- **Point of Contact from curators:** [Kate Tasker, UCSF](mailto:kate.tasker@ucsf.edu)
- **Point of Contact Hugging Face:** [Pablo Montalvo](mailto:pablo@huggingface.co)
### Dataset Summary
Industry Documents Library (IDL) is a document dataset filtered from [UCSF documents library](https://www.industrydocuments.ucsf.edu/) with 19 million pages kept as valid samples.
Each document exists as a collection of a pdf, a tiff image with the same contents rendered, a json file containing extensive Textract OCR annotations from the [idl_data](https://github.com/furkanbiten/idl_data) project, and a .ocr file with the original, older OCR annotation. Each pdf may contain anywhere from 1 to 3000 pages.
<center>
<img src="https://huggingface.co/datasets/pixparse/IDL-wds/resolve/main/doc_images/idl_page_example.png" alt="An addendum from an internal legal document" width="600" height="300">
<p><em>An example page of one pdf document from the Industry Documents Library. </em></p>
</center>
This instance of IDL is in [webdataset](https://github.com/webdataset/webdataset/commits/main) .tar format.
### Usage with `chug`
Check out [chug](https://github.com/huggingface/chug), our optimized library for sharded dataset loading!
```python
import chug
task_cfg = chug.DataTaskDocReadCfg(page_sampling='all')
data_cfg = chug.DataCfg(
source='pixparse/idl-wds',
split='train',
batch_size=None,
format='hfids',
num_workers=0,
)
data_loader = chug.create_loader(
data_cfg,
task_cfg,
)
sample = next(iter(data_loader))
```
### Usage with datasets
This dataset can also be used with the webdataset library or current releases of Hugging Face `datasets`.
Here is an example using the `streaming` parameter. We recommend downloading the dataset rather than streaming it, to save bandwidth.
```python
from datasets import load_dataset

dataset = load_dataset('pixparse/idl-wds', streaming=True)
print(next(iter(dataset['train'])).keys())
>> dict_keys(['__key__', '__url__', 'json', 'ocr', 'pdf', 'tif'])
```
For faster download, you can directly use the `huggingface_hub` library. Make sure `hf_transfer` is installed prior to downloading, and check that you have enough disk space locally.
```python
import os
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"
from huggingface_hub import HfApi, logging
#logging.set_verbosity_debug()
hf = HfApi()
hf.snapshot_download("pixparse/idl-wds", repo_type="dataset", local_dir_use_symlinks=False)
```
Further, a metadata file `_pdfa-english-train-info-minimal.json` contains the list of samples per shard, with the same basename and a `.json` or `.pdf` extension,
as well as the count of files per shard.
#### Words and lines document metadata
Initially, we obtained the raw data from the IDL API and combined it with the `idl_data` annotation. This information is then reshaped into lines organized in reading order, under the `lines` key. We keep the non-reshaped word and bounding-box information under the `word` key, should users want to use their own heuristic.
The way we obtain an approximate reading order is simply by looking at the frequency peaks of the leftmost word x-coordinate. A frequency peak means that a high number of lines are starting from the same point. Then, we keep track of the x-coordinate of each such identified column. If no peaks are found, the document is assumed to be readable in plain format.
The code used to detect columns is shown below.
```python
import numpy as np
import scipy.ndimage
import scipy.signal

def get_columnar_separators(page, min_prominence=0.3, num_bins=10, kernel_width=1):
"""
Identifies the x-coordinates that best separate columns by analyzing the derivative of a histogram
of the 'left' values (xmin) of bounding boxes.
Args:
page (dict): Page data with 'bbox' containing bounding boxes of words.
min_prominence (float): The required prominence of peaks in the histogram.
num_bins (int): Number of bins to use for the histogram.
kernel_width (int): The width of the Gaussian kernel used for smoothing the histogram.
Returns:
separators (list): The x-coordinates that separate the columns, if any.
"""
try:
left_values = [b[0] for b in page['bbox']]
hist, bin_edges = np.histogram(left_values, bins=num_bins)
hist = scipy.ndimage.gaussian_filter1d(hist, kernel_width)
min_val = min(hist)
hist = np.insert(hist, [0, len(hist)], min_val)
bin_width = bin_edges[1] - bin_edges[0]
bin_edges = np.insert(bin_edges, [0, len(bin_edges)], [bin_edges[0] - bin_width, bin_edges[-1] + bin_width])
peaks, _ = scipy.signal.find_peaks(hist, prominence=min_prominence * np.max(hist))
derivatives = np.diff(hist)
separators = []
if len(peaks) > 1:
# This finds the index of the maximum derivative value between peaks
# which indicates peaks after trough --> column
for i in range(len(peaks)-1):
peak_left = peaks[i]
peak_right = peaks[i+1]
max_deriv_index = np.argmax(derivatives[peak_left:peak_right]) + peak_left
separator_x = bin_edges[max_deriv_index + 1]
separators.append(separator_x)
except Exception as e:
separators = []
return separators
```
That way, columnar documents can be better separated. This is a basic heuristic, but it should improve the overall readability of the documents.
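As a toy illustration of the peak-detection step above (synthetic coordinates, not taken from the dataset), two clusters of leftmost x-coordinates produce two histogram peaks, which the heuristic reads as a two-column layout:

```python
import numpy as np
import scipy.signal

# Synthetic leftmost x-coordinates: five lines starting near x=0.1
# (left column) and five starting near x=0.55 (right column).
left_values = [0.10, 0.11, 0.10, 0.12, 0.11, 0.55, 0.56, 0.55, 0.57, 0.56]

hist, bin_edges = np.histogram(left_values, bins=10)
padded = np.insert(hist, [0, len(hist)], 0)  # pad so edge bins can count as peaks
peaks, _ = scipy.signal.find_peaks(padded, prominence=0.3 * np.max(padded))

print(len(peaks))  # 2 peaks, one per column
```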
<div style="text-align: center;">
<img src="https://huggingface.co/datasets/pixparse/IDL-wds/resolve/main/doc_images/bounding_boxes_straight.png" alt="Numbered bounding boxes on a document" style="width: 600px; height: 800px; object-fit: cover; display: inline-block;">
<img src="https://huggingface.co/datasets/pixparse/IDL-wds/resolve/main/doc_images/arrows_plot_straight.png" alt="A simple representation of reading order" style="width: 600px; height: 800px; object-fit: cover; display: inline-block;">
</div>
<p style="text-align: center;"><em>Standard reading order for a single-column document. On the left, bounding boxes are ordered, and on the right a rendition of the corresponding reading order is given.</em></p>
<div style="text-align: center;">
<img src="https://huggingface.co/datasets/pixparse/IDL-wds/resolve/main/doc_images/bounding_boxes.png" alt="Numbered bounding boxes on a document" style="width: 600px; height: 800px; object-fit: cover; display: inline-block;">
<img src="https://huggingface.co/datasets/pixparse/IDL-wds/resolve/main/doc_images/arrows_plot.png" alt="A simple representation of reading order" style="width: 600px; height: 800px; object-fit: cover; display: inline-block;">
</div>
<p style="text-align: center;"><em>Heuristic-driven columnar reading order for a two-column document. On the left, bounding boxes are ordered, and on the right a rendition of the corresponding reading order is given. Some inaccuracies remain but the overall reading order is preserved.</em></p>
For each pdf document, we store statistics on the number of pages per shard and the number of valid samples per shard. A valid sample is one that can be encoded and then decoded, which we verified for every sample.
### Data, metadata and statistics.
The metadata for each document has been formatted in this way. Each `pdf` is paired with a `json` file with the following structure. Entries have been shortened for readability.
```json
{
"pages": [
{
"text": [
"COVIDIEN",
"Mallinckrodt",
"Addendum",
"This Addendum to the Consulting Agreement (the \"Agreement\") of July 28, 2010 (\"Effective Date\") by",
"and between David Brushwod, R.Ph., J.D., with an address at P.O. Box 100496, Gainesville, FL 32610-",
],
"bbox": [
[0.185964, 0.058857, 0.092199, 0.011457],
[0.186465, 0.079529, 0.087209, 0.009247],
[0.459241, 0.117854, 0.080015, 0.011332],
[0.117109, 0.13346, 0.751004, 0.014365],
[0.117527, 0.150306, 0.750509, 0.012954]
],
"poly": [
[
{"X": 0.185964, "Y": 0.058857}, {"X": 0.278163, "Y": 0.058857}, {"X": 0.278163, "Y": 0.070315}, {"X": 0.185964, "Y": 0.070315}
],
[
{"X": 0.186465, "Y": 0.079529}, {"X": 0.273673, "Y": 0.079529}, {"X": 0.273673, "Y": 0.088777}, {"X": 0.186465, "Y": 0.088777}
],
[
{"X": 0.459241, "Y": 0.117854}, {"X": 0.539256, "Y": 0.117854}, {"X": 0.539256, "Y": 0.129186}, {"X": 0.459241, "Y": 0.129186}
],
[
{"X": 0.117109, "Y": 0.13346}, {"X": 0.868113, "Y": 0.13346}, {"X": 0.868113, "Y": 0.147825}, {"X": 0.117109, "Y": 0.147825}
],
[
{"X": 0.117527, "Y": 0.150306}, {"X": 0.868036, "Y": 0.150306}, {"X": 0.868036, "Y": 0.163261}, {"X": 0.117527, "Y": 0.163261}
]
],
"score": [
0.9939, 0.5704, 0.9961, 0.9898, 0.9935
]
}
]
}
```
The top-level key, `pages`, is a list of every page in the document. The above example shows only one page. `text` is a list of lines in the document, with their individual associated bounding box in the next entry. `bbox` contains the bounding box coordinates in `left, top, width, height` format, with coordinates relative to the page size. `poly` is the corresponding polygon.
`score` is the confidence score for each line obtained with Textract.
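Since `bbox` and `poly` values are relative to the page size, they must be rescaled before drawing on a rendered page. A minimal sketch (the page resolution and the abridged metadata below are assumptions for illustration, not part of the dataset):

```python
import json

# Illustrative sketch: scale the relative bbox values (left, top, width,
# height, each a fraction of the page dimensions) to pixel coordinates.
# The metadata below is abridged from the example above.
metadata = json.loads("""
{
  "pages": [
    {
      "text": ["COVIDIEN", "Mallinckrodt"],
      "bbox": [
        [0.185964, 0.058857, 0.092199, 0.011457],
        [0.186465, 0.079529, 0.087209, 0.009247]
      ],
      "score": [0.9939, 0.5704]
    }
  ]
}
""")

def to_pixels(bbox, page_width, page_height):
    """Convert a relative (left, top, width, height) box to pixel units."""
    left, top, width, height = bbox
    return (
        round(left * page_width),
        round(top * page_height),
        round(width * page_width),
        round(height * page_height),
    )

# Assume the page was rendered at 1700x2200 px (roughly US Letter at 200 dpi).
page = metadata["pages"][0]
for line, box, score in zip(page["text"], page["bbox"], page["score"]):
    print(line, to_pixels(box, 1700, 2200), f"score={score}")
```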
### Data Splits
#### Train
* `idl-train-*.tar`
* Downloaded on 2023/12/16
* 3000 shards, 3144726 samples, 19174595 pages
## Additional Information
### Dataset Curators
Pablo Montalvo, Ross Wightman
### Licensing Information
While the Industry Documents Library is a public archive of documents and audiovisual materials, companies or individuals hold the rights to the information they created, meaning material cannot be “substantially” reproduced in books or other media without the copyright holder’s permission.
The use of copyrighted material, including reproduction, is governed by United States copyright law (Title 17, United States Code). The law may permit the “fair use” of a copyrighted work, including the making of a photocopy, “for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship or research.” 17 U.S.C. § 107.
The Industry Documents Library makes its collections available under court-approved agreements with the rightsholders or under the fair use doctrine, depending on the collection.
According to the US Copyright Office, when determining whether a particular use comes under “fair use” you must consider the following:
* the purpose and character of the use, including whether it is of a commercial nature or for nonprofit educational purposes;
* the nature of the copyrighted work itself;
* how much of the work you are using in relation to the copyrighted work as a whole (1 page of a 1000-page work, or 1 print advertisement vs. an entire 30-second advertisement);
* the effect of the use upon the potential market for or value of the copyrighted work. (For additional information, see the US Copyright Office Fair Use Index.)
Each user of this website is responsible for ensuring compliance with applicable copyright laws. Persons obtaining, or later using, a copy of copyrighted material in excess of “fair use” may become liable for copyright infringement. By accessing this website, the user agrees to hold harmless the University of California, its affiliates and their directors, officers, employees and agents from all claims and expenses, including attorneys’ fees, arising out of the use of this website by the user.
For more in-depth information on copyright and fair use, visit the [Stanford University Libraries’ Copyright and Fair Use website.](https://fairuse.stanford.edu/)
If you hold copyright to a document or documents in our collections and have concerns about our inclusion of this material, please see the IDL Take-Down Policy or contact us with any questions.
In the dataset, the Industry Documents Library API reports the following permission counts per file, showing that all files are now public (none are currently "confidential" or "privileged"; some were formerly so):
```json
{'public/no restrictions': 3005133,
'public/formerly confidential': 264978,
'public/formerly privileged': 30063,
'public/formerly privileged/formerly confidential': 669,
'public/formerly confidential/formerly privileged': 397,
}
```
|
Nexdata/5156_Images_Mathematical_Formula_Handwriting_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
208 Vietnamese 2D Living_Face & Anti_Spoofing Data. The collection scenes include indoor and outdoor scenes. The data includes both males and females. The age distribution ranges from juveniles to the elderly; young and middle-aged people are the majority. The race distribution is Vietnamese. The data includes various expressions, postures, anti-spoofing samples, multiple light conditions, and multiple scenes. The data can be used for tasks such as face payment, remote ID authentication, and face unlocking of mobile phones.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1321?source=Huggingface
## Data size
208 people, 210 videos and 76 images for each person
## Population distribution
race distribution: Vietnamese; gender distribution: 109 males, 99 females; age distribution: 21 people under 18 years old, 111 people aged from 18 to 45, 56 people aged from 46 to 60, 20 people over 60 years old
## Collecting environment
indoor: 129 people, outdoor: 79 people
## Data diversity
various expressions, postures, anti-spoofing samples, multiple light conditions, multiple scenes
## Device
cellphone, Pad
## Collecting time
169 people in daytime, 39 people at night
## Data format
.MOV or .MP4 or .jpg
## Labeling content
labels include person ID, nationality, gender, age, collecting scene, collecting time, facial action and light condition
## Accuracy rate
based on the accuracy of the actions, the accuracy exceeds 97%; action naming accuracy exceeds 97%
# Licensing Information
Commercial License
|
HuggingFaceM4/MMMU_MCQ | |
anikak/Indian-Accent-ASR | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 2156274.0
num_examples: 15
download_size: 1795
dataset_size: 2156274.0
---
# Dataset Card for "Indian-Accent-ASR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test-mathemakitt-63d0bd-1672359217 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test
eval_info:
task: text_zero_shot_classification
model: mathemakitten/opt-125m
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test
dataset_config: mathemakitten--winobias_antistereotype_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: mathemakitten/opt-125m
* Dataset: mathemakitten/winobias_antistereotype_test
* Config: mathemakitten--winobias_antistereotype_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Tristan](https://huggingface.co/Tristan) for evaluating this model. |
lhallee/HumanPPI_sep_reg | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: SeqA
dtype: string
- name: SeqB
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 25992799
num_examples: 26319
- name: valid
num_bytes: 239522
num_examples: 234
- name: test
num_bytes: 173184
num_examples: 180
download_size: 16123155
dataset_size: 26405505
---
# Dataset Card for "HumanPPI_sep_reg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gagan3012/hindawi | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Noto_Sans_Arabic
'1': Readex_Pro
'2': Amiri
'3': Noto_Kufi_Arabic
'4': Reem_Kufi_Fun
'5': Lateef
'6': Changa
'7': Kufam
'8': ElMessiri
'9': Reem_Kufi
'10': Noto_Naskh_Arabic
'11': Reem_Kufi_Ink
'12': Tajawal
'13': Aref_Ruqaa_Ink
'14': Markazi_Text
'15': IBM_Plex_Sans_Arabic
'16': Vazirmatn
'17': Harmattan
'18': Gulzar
'19': Scheherazade_New
'20': Cairo
'21': Amiri_Quran
'22': Noto_Nastaliq_Urdu
'23': Mada
'24': Aref_Ruqaa
'25': Almarai
'26': Alkalami
'27': Qahiri
splits:
- name: train
num_bytes: 4098675549.992
num_examples: 64624
- name: validation
num_bytes: 459422119.624
num_examples: 7196
download_size: 4536653671
dataset_size: 4558097669.616
---
# Dataset Card for "hindawi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bernabeSanchez/news-qa-summarization-73 | ---
license: mit
task_categories:
- summarization
- question-answering
- text-retrieval
- text-generation
language:
- en
size_categories:
- 0<n<100
---
|
alex-miller/iati-policy-markers | ---
language:
- en
- fr
- es
- de
license: apache-2.0
size_categories:
- 100K<n<1M
task_categories:
- text-classification
pretty_name: International Aid Transparency Initiative (IATI) Policy Marker Dataset
dataset_info:
features:
- name: iati_identifier
dtype: string
- name: reporting_org_ref
dtype: string
- name: text
dtype: string
- name: languages
dtype: string
- name: activity_dates
dtype: string
- name: gender_equality_sig
dtype: float64
- name: environment_sig
dtype: float64
- name: pdgg_sig
dtype: float64
- name: trade_sig
dtype: float64
- name: bio_diversity_sig
dtype: float64
- name: climate_mitigation_sig
dtype: float64
- name: climate_adaptation_sig
dtype: float64
- name: desertification_sig
dtype: float64
- name: rmnch_sig
dtype: float64
- name: drr_sig
dtype: int64
- name: disability_sig
dtype: int64
- name: nutrition_sig
dtype: int64
- name: gender_equality
dtype: bool
- name: environment
dtype: bool
- name: pdgg
dtype: bool
- name: trade
dtype: bool
- name: bio_diversity
dtype: bool
- name: climate_mitigation
dtype: bool
- name: climate_adaptation
dtype: bool
- name: desertification
dtype: bool
- name: rmnch
dtype: bool
- name: drr
dtype: bool
- name: disability
dtype: bool
- name: nutrition
dtype: bool
splits:
- name: train
num_bytes: 816392535
num_examples: 868558
download_size: 288495498
dataset_size: 816392535
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- finance
- climate
---
# International Aid Transparency Initiative (IATI) Policy Marker Dataset
A multi-purpose dataset including all activity title and description text published to IATI with metadata for policy markers.
For more information on IATI policy markers, see the [element page](https://iatistandard.org/en/iati-standard/203/activity-standard/iati-activities/iati-activity/policy-marker/) on the IATI Standard Website.
IATI is a living data source, and this dataset was last updated on March 28, 2024. For the code to generate an updated version of this dataset, please see my Github repository [here](https://github.com/akmiller01/iati-policy-marker-hf-dataset).
For any given policy marker in this dataset, there are two columns. One indicates whether the publisher utilizes the policy marker at all (e.g. `gender_equality`), and the other represents the marker significance (e.g. `gender_equality_sig`).
The language column is pipe separated. It should be mostly ISO 639, but it's a free-text field so there may be other values.
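Because the field is free text rather than a strict enumeration, a defensive split helps; a small illustrative sketch (not part of the dataset tooling):

```python
# Illustrative helper: split the pipe-separated "languages" field,
# trimming whitespace and dropping empty segments, since the field
# is free text and values may be messy.
def split_languages(value):
    return [part.strip() for part in value.split("|") if part.strip()]

print(split_languages("en|fr| es"))  # ['en', 'fr', 'es']
```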
A suggested use would be:
```python
from datasets import load_dataset
iati = load_dataset("alex-miller/iati-policy-markers", split="train")
gender_relevant_iati = iati.filter(lambda example: example["gender_equality"]).rename_column("gender_equality_sig", "label")
cols_to_remove = gender_relevant_iati.column_names
cols_to_remove.remove("text")
cols_to_remove.remove("label")
gender_relevant_iati = gender_relevant_iati.remove_columns(cols_to_remove)
dataset = gender_relevant_iati.class_encode_column("label").train_test_split(
test_size=0.2,
stratify_by_column="label",
shuffle=True,
)
``` |
TeamSODA/signal_processing_attacks_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: original_transcription
dtype: string
- name: class_label
dtype: int64
splits:
- name: train
num_bytes: 30945820.0
num_examples: 60
download_size: 30289902
dataset_size: 30945820.0
---
# Dataset Card for "signal_processing_attacks_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marup/TakahashiAmatoSongRVC | ---
license: openrail
---
|
tyzhu/squad_qa_title_v5_full_recite_ans_sent_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8044721.096877931
num_examples: 4778
- name: validation
num_bytes: 413353
num_examples: 300
download_size: 1587878
dataset_size: 8458074.09687793
---
# Dataset Card for "squad_qa_title_v5_full_recite_ans_sent_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
explodinggradients/eli5-test | ---
dataset_info:
features:
- name: context
dtype: string
- name: prompt
dtype: string
- name: ground_truth
sequence: string
- name: references
sequence: 'null'
- name: generated_text
dtype: string
splits:
- name: test_eli5
num_bytes: 1159353
num_examples: 500
download_size: 716889
dataset_size: 1159353
---
# Dataset Card for "eli5-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baobab-trees/wikipedia-human-retrieval-ja | ---
license: apache-2.0
task_categories:
- question-answering
language:
- ja
size_categories:
- 1K<n<10K
---
# Japanese Wikipedia Human Retrieval dataset
This is a Japanese question answering dataset with retrieval over Wikipedia articles
performed by trained human workers.
## Contributors
* [Yusuke Oda](https://github.com/odashi)
defined the dataset specification, data structure, and the scheme of data collection.
* [Baobab, Inc.](https://baobab-trees.com/en)
operated data collection, data checking, and formatting.
## About the dataset
Each entry represents a single QA session:
given a question sentence, the responsible worker tried to search for the appropriate
information from Wikipedia using the search box and/or inner hyperlinks, and constructed
the answer paragraphs according to the search results.
The whole process of each retrieval is recorded manually by the same worker.
All sessions are processed from 2023-12-04 to 2023-12-25 with access to Japanese Wikipedia
(http://ja.wikipedia.org/).
The dataset consists of following data:
* A question sentence
* The final answer paragraph (whole sentence, and chunks with citations)
* List of references with either extracted paragraph or summarization from a Wikipedia
article
## Target situation and limitation
We designed this dataset to ensure that the answers reflect only the exact information written in the cited references,
and do not reflect any external information or implicit knowledge.
This design is useful to measure/investigate QA tasks with accurate retrieval from the given data source.
Please keep in mind that the dataset is not designed to provide a QA with correct information.
We strictly required the workers to answer the questions based only on explicit citations from Wikipedia.
This means the workers may have written answers that differ from their own knowledge,
and left the answer empty if they couldn't find any supporting information on Wikipedia,
even if they knew something that would answer the question.
## Dataset chunks
As well as successful sessions with answer paragraphs, we also recorded failed sessions
in which the worker could not construct an answer from the search results.
In these cases we still recorded the retrieval process, despite the lack of an answer.
We release this version of the dataset with the following dataset chunks:
* "answered" chunk (838 examples): question, answer, and retrieval process
* "not_answered" chunk (433 examples): question and retrieval process (no answer)
## Data structure
Each entry has the following schema:
```js
{
"id": number, // Entry ID
"question": string, // Question sentence
// Answer section
// Absense of this field means that the worker failed to answer the question.
"answer": {
"text": string, // Answer paragraph
// Answer sentences
// These sentences are written by the workers based on the cited references.
// The above answer paragraph is generated by joining all texts in this list.
"sentences": [
{
"text": string, // Answer sentence
"citations": number[], // List of reference IDs for citation
}
],
},
// Reference list
"references": [
{
// Either "search" or "link" field exists.
// Information for direct search (search box on Wikipedia)
"search": {
"keywords": string[], // List of words input into the search box
},
// Information for hyperlinks
"link": {
"referrer": number, // The reference ID at which the worker clicked the hyperlink
}
// Either "page" or "not_found" field exists.
// Extracted content
"page": {
"title": string, // Title of the Wikipedia article
"url": string, // URL of the Wikipedia article
// Either "quote" or "summary" field exists.
// Absense of both field means that the page doesn't contain appropriate data.
// Information for direct quotation
// There could be multiple "page" fields with "quote" subfield if multiple
// sentences are extracted from distant positions in the same page.
"quote": {
"text": string, // Consecutive texts extracted from the article
},
// Information for summarization
"summary": {
"text": string, // Summary text about the page written by the worker.
"method": string, // Description about how the worker wrote the summary.
}
}
// Search result (not found)
"not_found": {
"url": string, // URL of the Wikipedia search results
}
},
],
}
```
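Two invariants follow from this schema: the sentence texts concatenate (without separators) into `answer.text`, and every citation ID indexes into `references`. A small validation sketch under those assumptions (illustrative, not part of the dataset tooling):

```python
# Illustrative entry with the same shape as the schema above.
entry = {
    "id": 1,
    "question": "...",
    "answer": {
        "text": "AB",
        "sentences": [
            {"text": "A", "citations": [0]},
            {"text": "B", "citations": [1]},
        ],
    },
    "references": [{"search": {"keywords": ["..."]}}, {"link": {"referrer": 0}}],
}

def check_entry(entry):
    """Check that sentences rebuild the answer and citations are in range."""
    answer = entry.get("answer")
    if answer is None:  # "not_answered" chunk: no answer field to validate
        return True
    joined = "".join(s["text"] for s in answer["sentences"])
    if joined != answer["text"]:
        return False
    n_refs = len(entry["references"])
    return all(
        0 <= c < n_refs for s in answer["sentences"] for c in s["citations"]
    )

print(check_entry(entry))  # True for a well-formed entry
```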
Example ("answered" data ID=1):
```json
{
"id": 1,
"question": "経済産業省の役割について知りたい。",
"answer": {
"text": "経済産業省は、日本の行政機関のひとつです。経済および産業の発展ならびに鉱物資源およびエネルギー資源の供給に関する行政を所管しています。民間の経済活力の向上及び対外経済関係の円滑な発展を中心とする経済及び産業の発展並びに鉱物資源及びエネルギー資源の安定的かつ効率的な供給の確保を図るために、マクロ経済政策、産業政策、通商政策、貿易管理業務、産業技術政策、流通政策、エネルギー政策などを所管しています。",
"sentences": [
{
"text": "経済産業省は、日本の行政機関のひとつです。",
"citations": [
0
]
},
{
"text": "経済および産業の発展ならびに鉱物資源およびエネルギー資源の供給に関する行政を所管しています。",
"citations": [
0
]
},
{
"text": "民間の経済活力の向上及び対外経済関係の円滑な発展を中心とする経済及び産業の発展並びに鉱物資源及びエネルギー資源の安定的かつ効率的な供給の確保を図るために、マクロ経済政策、産業政策、通商政策、貿易管理業務、産業技術政策、流通政策、エネルギー政策などを所管しています。",
"citations": [
1
]
}
]
},
"references": [
{
"search": {
"keywords": [
"経済産業省"
]
},
"page": {
"title": "経済産業省",
"url": "https://ja.wikipedia.org/wiki/%E7%B5%8C%E6%B8%88%E7%94%A3%E6%A5%AD%E7%9C%81",
"quote": {
"text": "経済産業省(けいざいさんぎょうしょう、英: Ministry of Economy, Trade and Industry、略称: METI)は、日本の行政機関のひとつ[4]。経済および産業の発展ならびに鉱物資源およびエネルギー資源の供給に関する行政を所管する[注釈 1]。"
}
}
},
{
"search": {
"keywords": [
"経済産業省"
]
},
"page": {
"title": "経済産業省",
"url": "https://ja.wikipedia.org/wiki/%E7%B5%8C%E6%B8%88%E7%94%A3%E6%A5%AD%E7%9C%81",
"quote": {
"text": "経済産業省設置法第3条の定める任務である「民間の経済活力の向上及び対外経済関係の円滑な発展を中心とする経済及び産業の発展並びに鉱物資源及びエネルギー資源の安定的かつ効率的な供給の確保を図ること」を達成するため、マクロ経済政策、産業政策、通商政策、貿易管理業務、産業技術政策、流通政策、エネルギー政策などを所管する。"
}
}
}
]
}
``` |
ohtaman/aozora | ---
dataset_info:
features:
- name: title
dtype: string
- name: author
dtype: string
- name: content
dtype: string
- name: filename
dtype: string
- name: category
dtype: string
- name: short_description
dtype: string
- name: char_kana_type
dtype: string
splits:
- name: train
num_bytes: 704528623.1545657
num_examples: 17006
- name: test
num_bytes: 4142823.8454343504
num_examples: 100
download_size: 393522386
dataset_size: 708671447.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "aozora"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aiancheruk/go_emotions_mini | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: labels
sequence:
class_label:
names:
'0': admiration
'1': amusement
'2': anger
'3': annoyance
'4': approval
'5': caring
'6': confusion
'7': curiosity
'8': desire
'9': disappointment
'10': disapproval
'11': disgust
'12': embarrassment
'13': excitement
'14': fear
'15': gratitude
'16': grief
'17': joy
'18': love
'19': nervousness
'20': optimism
'21': pride
'22': realization
'23': relief
'24': remorse
'25': sadness
'26': surprise
'27': neutral
- name: id
dtype: string
splits:
- name: train
num_bytes: 48653.97373876987
num_examples: 500
- name: validation
num_bytes: 9714.688536675267
num_examples: 100
- name: test
num_bytes: 9663.589460106872
num_examples: 100
download_size: 54811
dataset_size: 68032.25173555201
---
# Dataset Card for "go_emotions_shrinked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713193800 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 62566
num_examples: 146
download_size: 26873
dataset_size: 62566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B | ---
pretty_name: Evaluation run of vicgalle/CarbonBeagle-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vicgalle/CarbonBeagle-11B](https://huggingface.co/vicgalle/CarbonBeagle-11B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T15:49:25.767199](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B/blob/main/results_2024-01-21T15-49-25.767199.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6705898940993614,\n\
\ \"acc_stderr\": 0.03156444593375738,\n \"acc_norm\": 0.6708641524489116,\n\
\ \"acc_norm_stderr\": 0.03221364260604881,\n \"mc1\": 0.5458996328029376,\n\
\ \"mc1_stderr\": 0.01742959309132352,\n \"mc2\": 0.6942950160280645,\n\
\ \"mc2_stderr\": 0.015190819809321073\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n\
\ \"acc_norm\": 0.7184300341296929,\n \"acc_norm_stderr\": 0.013143376735009022\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7221668990240988,\n\
\ \"acc_stderr\": 0.004470152081675125,\n \"acc_norm\": 0.8892650866361282,\n\
\ \"acc_norm_stderr\": 0.0031316226281990814\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.034370793441061344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48677248677248675,\n \"acc_stderr\": 0.025742297289575142,\n \"\
acc_norm\": 0.48677248677248675,\n \"acc_norm_stderr\": 0.025742297289575142\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.022331707611823085,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.022331707611823085\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"\
acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028084,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028084\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.03021683101150877,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.03021683101150877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917669,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917669\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876168,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876168\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7658959537572254,\n \"acc_stderr\": 0.022797110278071124,\n\
\ \"acc_norm\": 0.7658959537572254,\n \"acc_norm_stderr\": 0.022797110278071124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n\
\ \"acc_stderr\": 0.016709709877662,\n \"acc_norm\": 0.48044692737430167,\n\
\ \"acc_norm_stderr\": 0.016709709877662\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7427652733118971,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.7427652733118971,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451156,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5149934810951761,\n\
\ \"acc_stderr\": 0.01276449320219325,\n \"acc_norm\": 0.5149934810951761,\n\
\ \"acc_norm_stderr\": 0.01276449320219325\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7009803921568627,\n \"acc_stderr\": 0.018521756215423027,\n \
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.018521756215423027\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.026711430555538405,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.026711430555538405\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018512,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018512\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5458996328029376,\n\
\ \"mc1_stderr\": 0.01742959309132352,\n \"mc2\": 0.6942950160280645,\n\
\ \"mc2_stderr\": 0.015190819809321073\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6694465504169825,\n \
\ \"acc_stderr\": 0.012957496367085026\n }\n}\n```"
repo_url: https://huggingface.co/vicgalle/CarbonBeagle-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|arc:challenge|25_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|gsm8k|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hellaswag|10_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T15-49-25.767199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- '**/details_harness|winogrande|5_2024-01-21T15-49-25.767199.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T15-49-25.767199.parquet'
- config_name: results
data_files:
- split: 2024_01_21T15_49_25.767199
path:
- results_2024-01-21T15-49-25.767199.parquet
- split: latest
path:
- results_2024-01-21T15-49-25.767199.parquet
---
# Dataset Card for Evaluation run of vicgalle/CarbonBeagle-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vicgalle/CarbonBeagle-11B](https://huggingface.co/vicgalle/CarbonBeagle-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-21T15:49:25.767199](https://huggingface.co/datasets/open-llm-leaderboard/details_vicgalle__CarbonBeagle-11B/blob/main/results_2024-01-21T15-49-25.767199.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6705898940993614,
"acc_stderr": 0.03156444593375738,
"acc_norm": 0.6708641524489116,
"acc_norm_stderr": 0.03221364260604881,
"mc1": 0.5458996328029376,
"mc1_stderr": 0.01742959309132352,
"mc2": 0.6942950160280645,
"mc2_stderr": 0.015190819809321073
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7184300341296929,
"acc_norm_stderr": 0.013143376735009022
},
"harness|hellaswag|10": {
"acc": 0.7221668990240988,
"acc_stderr": 0.004470152081675125,
"acc_norm": 0.8892650866361282,
"acc_norm_stderr": 0.0031316226281990814
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48677248677248675,
"acc_stderr": 0.025742297289575142,
"acc_norm": 0.48677248677248675,
"acc_norm_stderr": 0.025742297289575142
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823085,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823085
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028084,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028084
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.03021683101150877,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.03021683101150877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917669,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917669
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876168,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7658959537572254,
"acc_stderr": 0.022797110278071124,
"acc_norm": 0.7658959537572254,
"acc_norm_stderr": 0.022797110278071124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877662,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877662
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7427652733118971,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.7427652733118971,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451156,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5149934810951761,
"acc_stderr": 0.01276449320219325,
"acc_norm": 0.5149934810951761,
"acc_norm_stderr": 0.01276449320219325
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.018521756215423027,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.018521756215423027
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.026711430555538405,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.026711430555538405
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018512,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018512
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5458996328029376,
"mc1_stderr": 0.01742959309132352,
"mc2": 0.6942950160280645,
"mc2_stderr": 0.015190819809321073
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.6694465504169825,
"acc_stderr": 0.012957496367085026
}
}
```
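Each per-task block in the JSON above shares the same shape, so the file is straightforward to post-process. A minimal sketch (the dict literal below is a hand-copied excerpt of the results shown above; the macro-average is illustrative only, not the leaderboard's official aggregation formula):

```python
# Excerpt of the per-task results shown above (values copied by hand).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7184300341296929},
    "harness|hellaswag|10": {"acc_norm": 0.8892650866361282},
    "harness|winogrande|5": {"acc": 0.840568271507498},
    "harness|gsm8k|5": {"acc": 0.6694465504169825},
}

def primary_metric(task_scores):
    # Prefer the normalized accuracy when a task reports one.
    return task_scores.get("acc_norm", task_scores.get("acc"))

scores = {task: primary_metric(v) for task, v in results.items()}
macro_avg = sum(scores.values()) / len(scores)
print(f"macro average over {len(scores)} tasks: {macro_avg:.4f}")
```

The same loop works over the full JSON once it is loaded with `json.load`, since every `harness|...` key maps to a flat dict of metrics.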
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zche318/microstructure_porosity_scattered | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7022408.8
num_examples: 3940
download_size: 5502114
dataset_size: 7022408.8
---
# Dataset Card for "microstructure_porosity_scattered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ambriel_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ambriel/アンブリエル/安比尔 (Arknights)
This is the dataset of ambriel/アンブリエル/安比尔 (Arknights), containing 121 images and their tags.
The core tags of this character are `long_hair, pink_hair, halo, one_side_up, hair_ornament, hairclip, purple_eyes, very_long_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 121 | 206.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ambriel_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 121 | 170.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ambriel_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 317 | 346.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ambriel_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ambriel_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
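The `IMG+TXT` packages in the table above pair each image with a tag file. A minimal sketch of walking an extracted package, assuming the common layout of one same-stem `.txt` file per image (the helper name `pair_images_with_tags` and the throwaway demo directory are illustrative, not part of the dataset tooling):

```python
import os
import tempfile

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def pair_images_with_tags(dataset_dir):
    """Return (image_path, tag_string) pairs for an extracted IMG+TXT package."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):  # skip images without a tag file
            with open(txt_path, encoding="utf-8") as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs

# Demo on a throwaway directory standing in for an extracted package.
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, "001.png"), "wb").close()
with open(os.path.join(demo_dir, "001.txt"), "w", encoding="utf-8") as f:
    f.write("1girl, solo, looking_at_viewer")
pairs = pair_images_with_tags(demo_dir)
print(pairs)
```

For real use, point `dataset_dir` at the directory where `dataset-1200.zip` (or another IMG+TXT package) was extracted.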
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, black_jacket, solo, holding_gun, long_sleeves, looking_at_viewer, black_thighhighs, mouth_hold, black_gloves, pocky, food_in_mouth, sniper_rifle, open_jacket, grey_shirt, energy_wings, black_skirt, black_footwear, choker |
| 1 | 13 |  |  |  |  |  | 1girl, black_jacket, long_sleeves, looking_at_viewer, solo, black_gloves, simple_background, upper_body, open_jacket, pocky, grey_shirt, holding_food, food_in_mouth, mouth_hold, white_background, grey_background, grey_eyes, hand_up, infection_monitor_(arknights) |
| 2 | 5 |  |  |  |  |  | 1girl, black_footwear, black_gloves, black_jacket, black_thighhighs, boots, full_body, long_sleeves, solo, holding_gun, looking_at_viewer, sniper_rifle, energy_wings, mouth_hold, pink_eyes, simple_background, skirt, standing, white_background |
| 3 | 16 |  |  |  |  |  | 1girl, eyewear_on_head, heart-shaped_eyewear, sunglasses, solo, looking_at_viewer, red_one-piece_swimsuit, sleeveless_shirt, tied_shirt, white_shirt, casual_one-piece_swimsuit, official_alternate_costume, bare_shoulders, wrist_scrunchie, holding_food, nail_polish, thigh_strap, black_choker, cowboy_shot, detached_wings, energy_wings, food_in_mouth, infection_monitor_(arknights), pink_nails, simple_background, white_background, blue_sky, ice_cream, mouth_hold, necklace, pink_eyes, thighs, bare_arms, blush, day, frills, large_breasts, outdoors, white_belt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | solo | holding_gun | long_sleeves | looking_at_viewer | black_thighhighs | mouth_hold | black_gloves | pocky | food_in_mouth | sniper_rifle | open_jacket | grey_shirt | energy_wings | black_skirt | black_footwear | choker | simple_background | upper_body | holding_food | white_background | grey_background | grey_eyes | hand_up | infection_monitor_(arknights) | boots | full_body | pink_eyes | skirt | standing | eyewear_on_head | heart-shaped_eyewear | sunglasses | red_one-piece_swimsuit | sleeveless_shirt | tied_shirt | white_shirt | casual_one-piece_swimsuit | official_alternate_costume | bare_shoulders | wrist_scrunchie | nail_polish | thigh_strap | black_choker | cowboy_shot | detached_wings | pink_nails | blue_sky | ice_cream | necklace | thighs | bare_arms | blush | day | frills | large_breasts | outdoors | white_belt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:--------------|:---------------|:--------------------|:-------------------|:-------------|:---------------|:--------|:----------------|:---------------|:--------------|:-------------|:---------------|:--------------|:-----------------|:---------|:--------------------|:-------------|:---------------|:-------------------|:------------------|:------------|:----------|:--------------------------------|:--------|:------------|:------------|:--------|:-----------|:------------------|:-----------------------|:-------------|:-------------------------|:-------------------|:-------------|:--------------|:----------------------------|:-----------------------------|:-----------------|:------------------|:--------------|:--------------|:---------------|:--------------|:-----------------|:-------------|:-----------|:------------|:-----------|:---------|:------------|:--------|:------|:---------|:----------------|:-----------|:-------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | | X | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | | | X | | X | | X | | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | | X | | | X | | X | | | X | | | | X | | | | X | | X | X | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Elaina617/nijika | ---
license: openrail
---
|
lmms-lab/VizWiz-VQA | ---
dataset_info:
features:
- name: question_id
dtype: string
- name: image
dtype: image
- name: question
dtype: string
- name: answers
sequence: string
- name: category
dtype: string
splits:
- name: val
num_bytes: 2097998373.0
num_examples: 4319
- name: test
num_bytes: 3982325314.0
num_examples: 8000
download_size: 6050372614
dataset_size: 6080323687.0
---
# Dataset Card for "VizWiz-VQA"
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [VizWiz-VQA](https://vizwiz.org/tasks-and-datasets/vqa/). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
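Each VizWiz question carries multiple crowdsourced answers (the `answers` sequence in the features above); the standard VQA metric credits a prediction by how many annotators gave the same answer. A minimal sketch of that formula, omitting the official answer normalization (lowercasing, punctuation and article stripping) that the evaluation pipeline would apply:

```python
def vqa_accuracy(prediction, answers):
    """Standard VQA accuracy: full credit once 3+ annotators agree with the prediction."""
    matches = sum(answer == prediction for answer in answers)
    return min(matches / 3.0, 1.0)

# One prediction scored against ten crowdsourced answers.
answers = ["yes", "yes", "yes", "no", "yes", "unanswerable",
           "yes", "no", "yes", "yes"]
print(vqa_accuracy("yes", answers))  # 7 matches -> capped at 1.0
print(vqa_accuracy("no", answers))   # 2 matches -> 2/3
```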
```bibtex
@inproceedings{gurari2018vizwiz,
title={Vizwiz grand challenge: Answering visual questions from blind people},
author={Gurari, Danna and Li, Qing and Stangl, Abigale J and Guo, Anhong and Lin, Chi and Grauman, Kristen and Luo, Jiebo and Bigham, Jeffrey P},
booktitle={Proceedings of the IEEE conference on computer vision and pattern recognition},
pages={3608--3617},
year={2018}
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
biglam/contentious_contexts | ---
annotations_creators:
- expert-generated
- crowdsourced
language:
- nl
language_creators:
- machine-generated
license:
- cc-by-2.0
multilinguality:
- monolingual
pretty_name: Contentious Contexts Corpus
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- newspapers
- historic
- dutch
- problematic
- ConConCor
task_categories:
- text-classification
task_ids:
- sentiment-scoring
- multi-label-classification
---
# Dataset Card for Contentious Contexts Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [ConConCor](https://github.com/cultural-ai/ConConCor)
- **Repository:** [ConConCor](https://github.com/cultural-ai/ConConCor)
- **Paper:** [N/A]
- **Leaderboard:** [N/A]
- **Point of Contact:** [Jacco van Ossenbruggen](https://github.com/jrvosse)
**Note** One can also find a Datasheet produced by the creators of this dataset as a [PDF document](https://github.com/cultural-ai/ConConCor/blob/main/Dataset/DataSheet.pdf)
### Dataset Summary
This dataset contains extracts from historical Dutch newspapers containing keywords of potentially contentious words (according to present-day sensibilities). The dataset contains multiple annotations per instance, giving the option to quantify agreement scores for annotations. This dataset can be used to track how words and their meanings have changed over time.
### Supported Tasks and Leaderboards
- `text-classification`: This dataset can be used for tracking how the meanings of words in different contexts have changed and become contentious over time.
### Languages
The text in the dataset is in Dutch. The responses are available in both English and Dutch. Suggestions, where present, are only in Dutch. The associated BCP-47 code is `nl`
## Dataset Structure
### Data Instances
```
{
'extract_id': 'H97',
'text': 'en waardoor het eerste doel wordt voorbijgestreefd om voor den 5D5c5Y 5d-5@5j5g5d5e5Z5V5V5c een speciale eigen werkingssfeer te
scheppen.Intusschen is het',
'target': '5D 5c5Y5d-5@5j5g5d5e5Z5V5V5c',
'annotator_responses_english': [
{'id': 'unknown_2a', 'response': 'Not contentious'},
{'id': 'unknown_2b', 'response': 'Contentious according to current standards'},
{'id': 'unknown_2c', 'response': "I don't know"},
{'id': 'unknown_2d', 'response': 'Contentious according to current standards'},
{'id': 'unknown_2e', 'response': 'Not contentious'},
{'id': 'unknown_2f', 'response': "I don't know"},
{'id': 'unknown_2g', 'response': 'Not contentious'}],
'annotator_responses_dutch': [
{'id': 'unknown_2a', 'response': 'Niet omstreden'},
{'id': 'unknown_2b', 'response': 'Omstreden naar huidige maatstaven'},
{'id': 'unknown_2c', 'response': 'Weet ik niet'},
{'id': 'unknown_2d', 'response': 'Omstreden naar huidige maatstaven'},
{'id': 'unknown_2e', 'response': 'Niet omstreden'},
{'id': 'unknown_2f', 'response': 'Weet ik niet'},
{'id': 'unknown_2g', 'response': 'Niet omstreden'}],
'annotator_suggestions': [
{'id': 'unknown_2a', 'suggestion': ''},
{'id': 'unknown_2b', 'suggestion': 'ander ras nodig'},
{'id': 'unknown_2c', 'suggestion': 'personen van ander ras'},
{'id': 'unknown_2d', 'suggestion': ''},
{'id': 'unknown_2e', 'suggestion': ''},
{'id': 'unknown_2f', 'suggestion': ''},
{'id': 'unknown_2g', 'suggestion': 'ras'}]
}
```
### Data Fields
|extract_id|text|target|annotator_responses_english|annotator_responses_dutch|annotator_suggestions|
|---|---|---|---|---|---|
|Unique identifier|Text|Target phrase or word|Response(translated to English)|Response in Dutch|Suggestions, if present|
### Data Splits
Train: 2720
## Dataset Creation
### Curation Rationale
> Cultural heritage institutions recognise the problem of language use in their collections. The cultural objects in archives, libraries, and museums contain words and phrases that are inappropriate in modern society but were used broadly back in times. Such words can be offensive and discriminative. In our work, we use the term "contentious" to refer to all (potentially) inappropriate or otherwise sensitive words. For example, words suggestive of some (implicit or explicit) bias towards or against something. The National Archives of the Netherlands stated that they "explore the possibility of explaining language that was acceptable and common in the past and providing it with contemporary alternatives", meanwhile "keeping the original descriptions [with contentious words], because they give an idea of the time in which they were made or included in the collection". There is a page on the institution website where people can report "offensive language".
### Source Data
#### Initial Data Collection and Normalization
> The queries were run on OCR'd versions of the Europeana Newspaper collection, as provided by the KB National Library of the Netherlands. We limited our pool to text categorised as "article", thus excluding other types of texts such as advertisements and family notices. We then only focused our sample on the 6 decades between 1890-01-01 and 1941-12-31, as this is the period available in the Europeana newspaper corpus. The dataset represents a stratified sample set over target word, decade, and newspaper issue distribution metadata. For the final set of extracts for annotation, we gave extracts sampling weights proportional to their actual probabilities, as estimated from the initial set of extracts via trigram frequencies, rather than sampling uniformly.
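As an illustration of the weighted (rather than uniform) sampling described above — a sketch, not the creators' actual code — extracts could be drawn with probability proportional to estimated sampling weights. The extract identifiers and weights below are hypothetical stand-ins for the trigram-frequency-based estimates mentioned in the datasheet:

```python
import random

# Hypothetical extract ids and sampling weights, standing in for
# probabilities estimated from trigram frequencies.
extracts = ["H1", "H2", "H3", "H4"]
weights = [0.1, 0.4, 0.3, 0.2]

random.seed(0)
# Weighted draw (with replacement) instead of uniform sampling.
sample = random.choices(extracts, weights=weights, k=2)
print(sample)
```

Extracts with higher estimated probability are proportionally more likely to enter the annotation pool.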
#### Who are the source language producers?
[N/A]
### Annotations
#### Annotation process
> The annotation process included 3 stages: pilot annotation, expert annotation, and crowdsourced annotation on the "Prolific" platform. All stages required the participation of Dutch speakers. The pilot stage was intended for testing the annotation layout, the instructions clarity, the number of sentences provided as context, the survey questions, and the difficulty of the task in general. The Dutch-speaking members of the Cultural AI Lab were asked to test the annotation process and give their feedback anonymously using Google Sheets. Six volunteers contributed to the pilot stage, each annotating the same 40 samples where either a context of 3 or 5 sentences surrounding the term were given. An individual annotation sheet had a table layout with 4 options to choose for every sample
> - 'Omstreden' (Contentious)
> - 'Niet omstreden' (Not contentious)
> - 'Weet ik niet' (I don't know)
> - 'Onleesbare OCR' (Illegible OCR)
>
> 2 open fields
> - 'Andere omstreden termen in de context' (Other contentious terms in the context)
> - 'Notities' (Notes)
>
> and the instructions in the header. The rows were the samples with the highlighted words, the tickboxes for every option, and 2 empty cells for the open questions. The obligatory part of the annotation was to select one of the 4 options for every sample. Finding other contentious terms in the given sample, leaving notes, and answering 4 additional open questions at the end of the task were optional. Based on the received feedback and the answers to the open questions in the pilot study, the following decisions were made regarding the next, experts' annotation stage:
> - The annotation layout was built in Google Forms as a questionnaire instead of the table layout in Google Sheets to make the data collection and analysis faster as the number of participants would increase;
> - The context window of 5 sentences per sample was found optimal;
> - The number of samples per annotator was increased to 50;
> - The option 'Omstreden' (Contentious) was changed to 'Omstreden naar huidige maatstaven' ('Contentious according to current standards') to clarify that annotators should judge contentiousness of the word's use in context from today's perspective;
> - The annotation instruction was edited to clarify 2 points: (1) that annotators while judging contentiousness should take into account not only a bolded word but also the context surrounding it, and (2) if a word seems even slightly contentious to an annotator, they should choose the option 'Omstreden naar huidige maatstaven' (Contentious according to current standards);
> - The non-required field for every sample 'Notities' (Notes) was removed as there was an open question at the end of the annotation, where participants could leave their comments;
> - Another open question was added at the end of the annotation asking how much time it took to complete the annotation.
#### Who are the annotators?
Volunteers and expert annotators.
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
## Accessing the annotations
Each example text has multiple annotations. These annotations may not always agree. There are various approaches one could take to calculate agreement, including a majority vote, rating some annotators more highly, or calculating a score based on the 'votes' of annotators. Since there are many ways of doing this, we have not implemented this as part of the dataset loading script.
An example of how one could generate an "OCR quality rating" based on the number of times an annotator labelled an example with `Illegible OCR`:
```python
from collections import Counter

def calculate_ocr_score(example):
    # Share of annotators who did *not* flag the extract as "Illegible OCR"
    annotator_responses = [response["response"] for response in example["annotator_responses_english"]]
    counts = Counter(annotator_responses)
    bad_ocr_ratings = counts.get("Illegible OCR", 0)
    return round(1 - bad_ocr_ratings / len(annotator_responses), 3)

dataset = dataset.map(lambda example: {"ocr_score": calculate_ocr_score(example)})
```
To take the majority vote (or return a tie) based on whether an example is labelled contentious or not:
```python
from collections import Counter

def most_common_vote(example):
    # Majority vote over the English responses; ties are reported explicitly
    annotator_responses = [response["response"] for response in example["annotator_responses_english"]]
    counts = Counter(annotator_responses)
    contentious_count = counts.get("Contentious according to current standards", 0)
    not_contentious_count = counts.get("Not contentious", 0)
    if contentious_count > not_contentious_count:
        return "contentious"
    if contentious_count < not_contentious_count:
        return "not_contentious"
    return "tied"
```
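Another option mentioned above is calculating a score based on the annotators' "votes". A minimal sketch of a simple agreement score — the share of annotators who chose the modal response — using the responses from the sample instance above:

```python
from collections import Counter

def agreement_score(example):
    """Fraction of annotators who chose the most common response."""
    responses = [r["response"] for r in example["annotator_responses_english"]]
    counts = Counter(responses)
    most_common_count = counts.most_common(1)[0][1]
    return round(most_common_count / len(responses), 3)

# Responses taken from the sample instance above (extract H97)
example = {
    "annotator_responses_english": [
        {"response": "Not contentious"},
        {"response": "Contentious according to current standards"},
        {"response": "I don't know"},
        {"response": "Contentious according to current standards"},
        {"response": "Not contentious"},
        {"response": "I don't know"},
        {"response": "Not contentious"},
    ]
}
print(agreement_score(example))  # 3 of 7 annotators agree -> 0.429
```

A score near 1 indicates strong annotator consensus; scores near 1/k (for k response options) indicate an example worth reviewing.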
### Social Impact of Dataset
This dataset can be used to see how words change in meaning over time.
### Discussion of Biases
> Due to the nature of the project, some examples used in this documentation may be shocking or offensive. They are provided only as an illustration or explanation of the resulting dataset and do not reflect the opinions of the project team or their organisations.
Since this project was explicitly created to help assess bias, it should be used primarily in the context of assessing bias and of methods for detecting bias.
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Cultural AI](https://github.com/cultural-ai)
### Licensing Information
CC-BY
### Citation Information
```
@misc{ContentiousContextsCorpus2021,
author = {Cultural AI},
title = {Contentious Contexts Corpus},
year = {2021},
publisher = {GitHub},
journal = {GitHub repository},
  howpublished = {\url{https://github.com/cultural-ai/ConConCor}},
}
``` |
result-kand2-sdxl-wuerst-karlo/7c5b039f | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1364
dataset_size: 182
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "7c5b039f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
panikos/nutrition5k_dataset | ---
license: cc-by-4.0
---
|
ajmangus/qm_mixture_1.0e | ---
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: charlie_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 72190684
num_examples: 599997
- name: validation
num_bytes: 7209771
num_examples: 59997
- name: test
num_bytes: 7223936
num_examples: 59997
download_size: 19912524
dataset_size: 86624391
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
sankettgorey/layouts_donut | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 1461607176.2173023
num_examples: 5362
- name: test
num_bytes: 182370076.8402208
num_examples: 671
- name: validation
num_bytes: 181812032.0684768
num_examples: 670
download_size: 1524050233
dataset_size: 1825789285.126
---
# Dataset Card for "layouts_donut"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chainyo/rvl-cdip | ---
license: other
---
The RVL-CDIP (Ryerson Vision Lab Complex Document Information Processing) dataset consists of 400,000 grayscale images in 16 classes, with 25,000 images per class. There are 320,000 training images, 40,000 validation images, and 40,000 test images. The images are sized so their largest dimension does not exceed 1000 pixels.
For questions and comments please contact Adam Harley (aharley@scs.ryerson.ca).
The full dataset can be found [here](https://www.cs.cmu.edu/~aharley/rvl-cdip/).
## Labels
0: advertisement
1: budget
2: email
3: file folder
4: form
5: handwritten
6: invoice
7: letter
8: memo
9: news article
10: presentation
11: questionnaire
12: resume
13: scientific publication
14: scientific report
15: specification
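For use with a custom loader, the class list above can be expressed as an id-to-name mapping (a sketch; the exact label strings should be checked against the label files shipped with the full dataset):

```python
# Class ids as listed above; the index in the list is the class id.
RVL_CDIP_LABELS = [
    "advertisement", "budget", "email", "file folder",
    "form", "handwritten", "invoice", "letter",
    "memo", "news article", "presentation", "questionnaire",
    "resume", "scientific publication", "scientific report", "specification",
]

id2label = dict(enumerate(RVL_CDIP_LABELS))
label2id = {name: i for i, name in id2label.items()}

print(id2label[8])          # memo
print(label2id["invoice"])  # 6
```

Such `id2label`/`label2id` dictionaries are the usual way to configure an image-classification head over this label space.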
## Citation
This dataset was introduced in this [paper](https://www.cs.cmu.edu/~aharley/icdar15/): A. W. Harley, A. Ufkes, and K. G. Derpanis, "Evaluation of Deep Convolutional Nets for Document Image Classification and Retrieval," in ICDAR, 2015.
## License
RVL-CDIP is a subset of IIT-CDIP, which came from the [Legacy Tobacco Document Library](https://www.industrydocuments.ucsf.edu/tobacco/), for which license information can be found [here](https://www.industrydocuments.ucsf.edu/help/copyright/).
## References
1. D. Lewis, G. Agam, S. Argamon, O. Frieder, D. Grossman, and J. Heard, "Building a test collection for complex document information processing," in Proc. 29th Annual Int. ACM SIGIR Conference (SIGIR 2006), pp. 665-666, 2006
2. The Legacy Tobacco Document Library (LTDL), University of California, San Francisco, 2007. http://legacy.library.ucsf.edu/.
|
anderloh/MotorizedTransportSplit5ClassFineTune | ---
dataset_info:
- config_name: 10min
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Helicopter
'1': Jet
'2': Racecar
'3': Trains
'4': Truck
splits:
- name: train
num_bytes: 29575409.0
num_examples: 132
- name: validation
num_bytes: 33608419.0
num_examples: 150
download_size: 59819577
dataset_size: 63183828.0
- config_name: 20min
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Helicopter
'1': Jet
'2': Racecar
'3': Trains
'4': Truck
splits:
- name: train
num_bytes: 52429134.0
num_examples: 234
- name: validation
num_bytes: 33608419.0
num_examples: 150
download_size: 79310846
dataset_size: 86037553.0
- config_name: 30min
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Helicopter
'1': Jet
'2': Racecar
'3': Trains
'4': Truck
splits:
- name: train
num_bytes: 75282858.0
num_examples: 336
- name: validation
num_bytes: 33608419.0
num_examples: 150
download_size: 108887038
dataset_size: 108891277.0
configs:
- config_name: 10min
data_files:
- split: train
path: 10min/train-*
- split: validation
path: 10min/validation-*
- config_name: 20min
data_files:
- split: train
path: 20min/train-*
- split: validation
path: 20min/validation-*
- config_name: 30min
data_files:
- split: train
path: 30min/train-*
- split: validation
path: 30min/validation-*
---
# Dataset Card for "MotorizedTransportSplit5ClassFineTune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/banking77 | ---
language:
- en
--- |
sethapun/cv_svamp_augmented_fold3_ver2 | ---
dataset_info:
features:
- name: body
dtype: string
- name: ques
dtype: string
- name: question
dtype: string
- name: equation
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 2745859
num_examples: 7946
- name: validation
num_bytes: 129746
num_examples: 330
download_size: 720562
dataset_size: 2875605
---
# Dataset Card for "cv_svamp_augmented_fold3_ver2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nimaster/autonlp-data-devign_raw_test | ---
languages:
- en
task_categories:
- text-classification
---
# AutoNLP Dataset for project: devign_raw_test
## Dataset Description
This dataset has been automatically processed by AutoNLP for project devign_raw_test.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "void ff_avg_h264_qpel16_mc32_msa ( uint8_t * dst , const uint8_t * src , ptrdiff_t stride ) { avc_lu[...]",
"target": 0
},
{
"text": "static void sd_cardchange ( void * opaque , bool load ) { SDState * sd = opaque ; qemu_set_irq ( sd [...]",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=2, names=['0', '1'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 21188 |
| valid | 5298 |
|
tyzhu/squad_qa_baseline_v5_full_recite_ans_sent_first_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2996506.0
num_examples: 2385
- name: validation
num_bytes: 395889
num_examples: 300
download_size: 842977
dataset_size: 3392395.0
---
# Dataset Card for "squad_qa_baseline_v5_full_recite_ans_sent_first_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_budecosystem__code-millenials-34b | ---
pretty_name: Evaluation run of budecosystem/code-millenials-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [budecosystem/code-millenials-34b](https://huggingface.co/budecosystem/code-millenials-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__code-millenials-34b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T04:22:33.986521](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__code-millenials-34b/blob/main/results_2024-01-05T04-22-33.986521.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49379316954274166,\n\
\ \"acc_stderr\": 0.0347032593316836,\n \"acc_norm\": 0.4973157421712678,\n\
\ \"acc_norm_stderr\": 0.03543054806190878,\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.45367169400803836,\n\
\ \"mc2_stderr\": 0.015502767323951004\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.46331058020477817,\n \"acc_stderr\": 0.014572000527756989,\n\
\ \"acc_norm\": 0.49829351535836175,\n \"acc_norm_stderr\": 0.01461130570505699\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5505875323640709,\n\
\ \"acc_stderr\": 0.0049641770352214214,\n \"acc_norm\": 0.7509460266879108,\n\
\ \"acc_norm_stderr\": 0.004315812968431589\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483184,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483184\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.535483870967742,\n\
\ \"acc_stderr\": 0.02837228779796294,\n \"acc_norm\": 0.535483870967742,\n\
\ \"acc_norm_stderr\": 0.02837228779796294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n\
\ \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398393,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398393\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"\
acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414356,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414356\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.025285585990017838,\n\
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.025285585990017838\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547307,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547307\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.636697247706422,\n \"acc_stderr\": 0.020620603919625804,\n \"\
acc_norm\": 0.636697247706422,\n \"acc_norm_stderr\": 0.020620603919625804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6323529411764706,\n \"acc_stderr\": 0.03384132045674118,\n \"\
acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.03384132045674118\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610798,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610798\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5067264573991032,\n\
\ \"acc_stderr\": 0.03355476596234355,\n \"acc_norm\": 0.5067264573991032,\n\
\ \"acc_norm_stderr\": 0.03355476596234355\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6012269938650306,\n \"acc_stderr\": 0.038470214204560246,\n\
\ \"acc_norm\": 0.6012269938650306,\n \"acc_norm_stderr\": 0.038470214204560246\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6475095785440613,\n\
\ \"acc_stderr\": 0.01708415024408138,\n \"acc_norm\": 0.6475095785440613,\n\
\ \"acc_norm_stderr\": 0.01708415024408138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5491329479768786,\n \"acc_stderr\": 0.026788811931562753,\n\
\ \"acc_norm\": 0.5491329479768786,\n \"acc_norm_stderr\": 0.026788811931562753\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.016125543823552968,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.016125543823552968\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.028614624752805445,\n\
\ \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.028614624752805445\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n\
\ \"acc_stderr\": 0.02839442137098453,\n \"acc_norm\": 0.5080385852090032,\n\
\ \"acc_norm_stderr\": 0.02839442137098453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327235,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327235\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n\
\ \"acc_stderr\": 0.012337391684530314,\n \"acc_norm\": 0.3709256844850065,\n\
\ \"acc_norm_stderr\": 0.012337391684530314\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.029674288281311172,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.029674288281311172\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43300653594771243,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.43300653594771243,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.03345563070339192,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.03345563070339192\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.03645981377388806,\n\
\ \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.03645981377388806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32068543451652387,\n\
\ \"mc1_stderr\": 0.016339170373280906,\n \"mc2\": 0.45367169400803836,\n\
\ \"mc2_stderr\": 0.015502767323951004\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6906077348066298,\n \"acc_stderr\": 0.012991329330823002\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3244882486732373,\n \
\ \"acc_stderr\": 0.012896095359768111\n }\n}\n```"
repo_url: https://huggingface.co/budecosystem/code-millenials-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-22-33.986521.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-22-33.986521.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- '**/details_harness|winogrande|5_2024-01-05T04-22-33.986521.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T04-22-33.986521.parquet'
- config_name: results
data_files:
- split: 2024_01_05T04_22_33.986521
path:
- results_2024-01-05T04-22-33.986521.parquet
- split: latest
path:
- results_2024-01-05T04-22-33.986521.parquet
---
# Dataset Card for Evaluation run of budecosystem/code-millenials-34b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [budecosystem/code-millenials-34b](https://huggingface.co/budecosystem/code-millenials-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__code-millenials-34b",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T04:22:33.986521](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__code-millenials-34b/blob/main/results_2024-01-05T04-22-33.986521.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49379316954274166,
"acc_stderr": 0.0347032593316836,
"acc_norm": 0.4973157421712678,
"acc_norm_stderr": 0.03543054806190878,
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.45367169400803836,
"mc2_stderr": 0.015502767323951004
},
"harness|arc:challenge|25": {
"acc": 0.46331058020477817,
"acc_stderr": 0.014572000527756989,
"acc_norm": 0.49829351535836175,
"acc_norm_stderr": 0.01461130570505699
},
"harness|hellaswag|10": {
"acc": 0.5505875323640709,
"acc_stderr": 0.0049641770352214214,
"acc_norm": 0.7509460266879108,
"acc_norm_stderr": 0.004315812968431589
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483184,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483184
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.535483870967742,
"acc_stderr": 0.02837228779796294,
"acc_norm": 0.535483870967742,
"acc_norm_stderr": 0.02837228779796294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398393,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398393
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414356,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414356
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.025285585990017838,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.025285585990017838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547307,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547307
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.636697247706422,
"acc_stderr": 0.020620603919625804,
"acc_norm": 0.636697247706422,
"acc_norm_stderr": 0.020620603919625804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.03384132045674118,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.03384132045674118
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610798,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610798
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5067264573991032,
"acc_stderr": 0.03355476596234355,
"acc_norm": 0.5067264573991032,
"acc_norm_stderr": 0.03355476596234355
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6012269938650306,
"acc_stderr": 0.038470214204560246,
"acc_norm": 0.6012269938650306,
"acc_norm_stderr": 0.038470214204560246
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.01708415024408138,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.01708415024408138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.026788811931562753,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.026788811931562753
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.016125543823552968,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.016125543823552968
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.028614624752805445,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.028614624752805445
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.02839442137098453,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.02839442137098453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327235,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327235
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530314,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.029674288281311172,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.029674288281311172
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43300653594771243,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.43300653594771243,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339192,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339192
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.03645981377388806,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.03645981377388806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32068543451652387,
"mc1_stderr": 0.016339170373280906,
"mc2": 0.45367169400803836,
"mc2_stderr": 0.015502767323951004
},
"harness|winogrande|5": {
"acc": 0.6906077348066298,
"acc_stderr": 0.012991329330823002
},
"harness|gsm8k|5": {
"acc": 0.3244882486732373,
"acc_stderr": 0.012896095359768111
}
}
```
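As a quick sanity check, the headline numbers from the `all` block above can be recombined by hand. The sketch below (plain Python, no dependencies) averages the three summary metrics; this unweighted mean is only a rough proxy, not the official leaderboard aggregation.

```python
# Headline metrics copied from the "all" block of the latest results above.
all_metrics = {
    "acc": 0.49379316954274166,
    "acc_norm": 0.4973157421712678,
    "mc2": 0.45367169400803836,
}

# Unweighted mean -- a rough proxy score, not the official leaderboard formula.
proxy_score = sum(all_metrics.values()) / len(all_metrics)
print(round(proxy_score, 4))
```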
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
umesh16071973/HRMS_FILTER_DATA | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_cola_one_relativizer | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 64603
num_examples: 870
- name: test
num_bytes: 64656
num_examples: 893
- name: train
num_bytes: 536468
num_examples: 7421
download_size: 294029
dataset_size: 665727
---
# Dataset Card for "MULTI_VALUE_cola_one_relativizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
taesiri/hateful-meme-captions | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: string
- name: text
dtype: string
- name: split
dtype: string
- name: label
dtype: int64
- name: blip2-opt-6.7b_captions.csv
dtype: string
- name: coca_captions.csv
dtype: string
- name: git-large-coco_captions.csv
dtype: string
- name: git-large-r-textcaps_captions.csv
dtype: string
- name: vit-gpt2_captions.csv
dtype: string
splits:
- name: validation
num_bytes: 4008680
num_examples: 10000
download_size: 2082131
dataset_size: 4008680
---
# Dataset Card for "hateful-meme-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Definite/my_precious | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': jms
'1': shin
'2': winner
'3': biblei
splits:
- name: train
num_bytes: 2691826.25
num_examples: 5000
- name: eval
num_bytes: 1615095.75
num_examples: 3000
- name: test
num_bytes: 1115019
num_examples: 2000
download_size: 3023909
dataset_size: 5421941.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
- split: test
path: data/test-*
---
|
mcgillcomplex/wikipedia-2023-11-bge-large-en-v1.5 | ---
language:
- en
configs:
- config_name: en
data_files:
- split: train
path: en/*
---
# Multilingual Embeddings for Wikipedia
This dataset contains the [wikimedia/wikipedia](https://huggingface.co/datasets/wikimedia/wikipedia) dataset dump from 2023-11-01, covering Wikipedia in all 300+ languages.
It is chunked following the [Cohere/wikipedia-2023-11-embed-multilingual-v3](https://huggingface.co/datasets/Cohere/wikipedia-2023-11-embed-multilingual-v3) dataset.
The embedding model is [BAAI/bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5). |
Hoseindb/a-private-dataset | ---
license: cc-by-nc-nd-3.0
---
|
Chewan/autotrain-data-chewan | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: chewan
## Dataset Description
This dataset has been automatically processed by AutoTrain for project chewan.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "One of the other reviewers has mentioned that after watching just 1 Oz episode you'll be hooked. They are right, as this is exactly what happened with me.<br /><br />The first thing that struck me about Oz was its brutality and unflinching scenes of violence, which set in right from the word GO. Trust me, this is not a show for the faint hearted or timid. This show pulls no punches with regards to drugs, sex or violence. Its is hardcore, in the classic use of the word.<br /><br />It is called OZ as that is the nickname given to the Oswald Maximum Security State Penitentary. It focuses mainly on Emerald City, an experimental section of the prison where all the cells have glass fronts and face inwards, so privacy is not high on the agenda. Em City is home to many..Aryans, Muslims, gangstas, Latinos, Christians, Italians, Irish and more....so scuffles, death stares, dodgy dealings and shady agreements are never far away.<br /><br />I would say the main appeal of the show is due to the fact that it goes where other shows wouldn't dare. Forget pretty pictures painted for mainstream audiences, forget charm, forget romance...OZ doesn't mess around. The first episode I ever saw struck me as so nasty it was surreal, I couldn't say I was ready for it, but as I watched more, I developed a taste for Oz, and got accustomed to the high levels of graphic violence. Not just violence, but injustice (crooked guards who'll be sold out for a nickel, inmates who'll kill on order and get away with it, well mannered, middle class inmates being turned into prison bitches due to their lack of street skills or prison experience) Watching Oz, you may become comfortable with what is uncomfortable viewing....thats if you can get in touch with your darker side.",
"target": 1
},
{
"text": "A wonderful little production. <br /><br />The filming technique is very unassuming- very old-time-BBC fashion and gives a comforting, and sometimes discomforting, sense of realism to the entire piece. <br /><br />The actors are extremely well chosen- Michael Sheen not only \"has got all the polari\" but he has all the voices down pat too! You can truly see the seamless editing guided by the references to Williams' diary entries, not only is it well worth the watching but it is a terrificly written and performed piece. A masterful production about one of the great master's of comedy and his life. <br /><br />The realism really comes home with the little things: the fantasy of the guard which, rather than use the traditional 'dream' techniques remains solid then disappears. It plays on our knowledge and our senses, particularly with the scenes concerning Orton and Halliwell and the sets (particularly of their flat with Halliwell's murals decorating every surface) are terribly well done.",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['negative', 'positive'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 1471 |
| valid | 602 |
|
Hack90/ncbi_genbank_part_50 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 26640706946
num_examples: 3973282
download_size: 10256237433
dataset_size: 26640706946
---
# Dataset Card for "ncbi_genbank_part_50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ecnu-aigc/EMID | ---
language:
- en
license: cc-by-nc-sa-4.0
size_categories:
- 10K<n<100K
tags:
- music
- emotion
- image
- cross_modal
dataset_info:
features:
- name: Audio_Filename
dtype:
audio:
mono: false
- name: genre
dtype: string
- name: feeling
dtype: string
- name: emotion
dtype: string
- name: Image1_filename
dtype: image
- name: Image1_tag
dtype: string
- name: Image1_text
dtype: string
- name: Image2_filename
dtype: image
- name: Image2_tag
dtype: string
- name: Image2_text
dtype: string
- name: Image3_filename
dtype: image
- name: Image3_tag
dtype: string
- name: Image3_text
dtype: string
- name: is_original_clip
dtype: bool
splits:
- name: train
num_bytes: 3966276607.886
num_examples: 10738
download_size: 372817257
dataset_size: 3966276607.886
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## Dataset Summary
**E**motionally paired **M**usic and **I**mage Dataset (**EMID**) is a novel dataset designed for the emotional matching of music and images. The EMID dataset contains 10,738 unique music clips, each of which is paired with 3 images in the same emotional category, as well as rich annotations. These music clips are categorized into the 13 emotional categories proposed by [What music makes us feel: At least 13 dimensions organize subjective experiences associated with music across different cultures](https://pnas.org/doi/full/10.1073/pnas.1910704117), from which we obtain 1,836 original music clips. Subsequently, we acquire images labeled with Mikels' eight emotions from [Building a Large Scale Dataset for Image Emotion Recognition: The Fine Print and The Benchmark](http://arxiv.org/abs/1605.02677). Through the processing pipeline proposed in our paper, we expand the original music clips and obtain the final dataset.
Please cite our paper when using the EMID dataset: https://arxiv.org/abs/2308.07622
### Processing Pipeline

### Statistics of EMID
| Emotion | A | B | C | D | E | F | G | H | I | J | K | L | M | Total |
| ---------------- | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :--: | :---: |
| Before expanding | 45 | 80 | 54 | 131 | 306 | 174 | 367 | 86 | 36 | 124 | 129 | 105 | 199 | 1836 |
| After expanding | 255 | 545 | 320 | 771 | 1531 | 889 | 1832 | 1036 | 323 | 1014 | 799 | 484 | 939 | 10738 |
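The per-category counts in the table above can be cross-checked mechanically; this small snippet (values transcribed from the table) confirms the reported totals of 1,836 and 10,738 clips.

```python
# Per-category clip counts for emotions A..M, transcribed from the table above.
before = [45, 80, 54, 131, 306, 174, 367, 86, 36, 124, 129, 105, 199]
after = [255, 545, 320, 771, 1531, 889, 1832, 1036, 323, 1014, 799, 484, 939]

assert len(before) == len(after) == 13  # the 13 emotional categories
print(sum(before), sum(after))  # totals reported in the rightmost column
```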
## Dataset Download
The published dataset takes the form of a `.csv` file that contains the filenames of music clips and the corresponding emotionally matched images. In order to use this dataset, you should download the csv file as well as the music/image data.
This dataset is licensed under a CC BY-NC-SA 4.0 license, which provides free access to the data for non-profit work. We cannot guarantee support for its users. This database comes as is, with no guarantee of correctness, and we are not liable for any damage it might cause.
## Data Fields
| filed_name | explanation | example |
| ---------------- | :----------------------------------------------------------- | :----------------------------------------------------------- |
| Audio_Filename | unique Filename of the music clip | 106.m4a___172.mp3 |
| genre | Letter A to M, representing one of the 13 emotional category | K |
| feeling | Feelings reported by participants after listening to this music clip and their proportions | "33% Sad, depressing, 22% Awe-inspiring, amazing, 22% Proud, strong, 22% Triumphant, heroic, 19% Dreamy, 15% Beautiful, 15% Bittersweet, 11% Calm, relaxing, serene, 11% Compassionate, sympathetic, 11% Entrancing, 11% Transcendent, mystical, 7% Eerie, mysterious, 7% Painful, 7% Tender, longing, 4% Energizing, pump-up, 4% Indignant, defiant" |
| emotion | Ratings of subjective experiences elicited by the music on 11 emotional dimensions, ranging from 1 to 9 | "5,5.3,5,6,3.1,6.1,5.1,3.6,6.2,5.9,5.6" |
| Image1_filename | - | excitement_0616.jpg |
| Image1_tag | emotional category of image 1 | excitement |
| Image1_text | The textual description of Image 1 generated by the GIT model | ['the marching band in the parade'] |
| Image2_filename | - | amusement_2906.jpg |
| Image2_tag | emotional category of image 2 | amusement |
| Image2_text | The textual description of Image 2 generated by the GIT model | ['the marching band at disneyland'] |
| Image3_filename | - | amusement_2226.jpg |
| Image3_tag | emotional category of image 3 | amusement |
| Image3_text | The textual description of Image 3 generated by the GIT model | ['a marching band in a parade with people watching.'] |
| is_original_clip | If this value is true, the music clip is from the original music dataset, otherwise it is expanded from the original music clip through our processing pipeline. The original music clips are considered to provide a better emotional matching performance | False |
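The `emotion` field packs the 11 dimension ratings into one comma-separated string; below is a minimal parsing sketch, using the example value from the table above.

```python
# Example "emotion" value from the field table above: 11 ratings on a 1-9 scale.
raw = "5,5.3,5,6,3.1,6.1,5.1,3.6,6.2,5.9,5.6"

ratings = [float(x) for x in raw.split(",")]
assert len(ratings) == 11  # one rating per emotional dimension
print(max(ratings))
```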
For more information about the feeling and emotion fields, you can refer to [What music makes us feel: At least 13 dimensions organize subjective experiences associated with music across different cultures](https://pnas.org/doi/full/10.1073/pnas.1910704117) and their visualized interactive [site](https://www.ocf.berkeley.edu/~acowen/music.html#). |
kings-crown/Aircraft_summary | ---
license: mit
---
|
arnmig/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
dtype: string
- name: labels
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
dtype: string
- name: assignees
dtype: string
- name: milestone
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: string
- name: author_association
dtype: string
- name: active_lock_reason
dtype: string
- name: draft
dtype: string
- name: pull_request
dtype: string
- name: body
dtype: string
- name: reactions
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: string
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 32536203
num_examples: 6214
download_size: 8102507
dataset_size: 32536203
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Isotonic/planner_dataset | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 33288024.48829969
num_examples: 31349
- name: test
num_bytes: 8322802.511700309
num_examples: 7838
download_size: 15269179
dataset_size: 41610827.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
projecte-aina/teca | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ca
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
pretty_name: teca
size_categories:
- unknown
source_datasets: []
task_categories:
- text-classification
task_ids:
- natural-language-inference
---
# Dataset Card for TE-ca
## Dataset Description
- **Website:** https://zenodo.org/record/4761458
- **Repository** [HuggingFace](https://huggingface.co/projecte-aina)
- **Paper:** [Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? A Comprehensive Assessment for Catalan](https://arxiv.org/abs/2107.07903)
- **Point of Contact:** [Language Technologies Unit](mailto:langtech@bsc.es)
### Dataset Summary
TE-ca is a dataset of textual entailment in Catalan, which contains 21,163 pairs of premises and hypotheses, annotated according to the inference relation between them (entailment, neutral or contradiction).
This dataset was developed by [BSC TeMU](https://temu.bsc.es/) as part of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina/), to enrich the [Catalan Language Understanding Benchmark (CLUB)](https://club.aina.bsc.es/).
This work is licensed under an <a rel="license" href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Attribution-NonCommercial-NoDerivatives 4.0 International License</a>.
### Supported Tasks and Leaderboards
Textual entailment, Text classification, Language Model
### Languages
The dataset is in Catalan (`ca-ES`).
## Dataset Structure
### Data Instances
Three JSON files, one for each split.
### Example:
<pre>
{
"id": 3247,
"premise": "L'ONU adopta a Marràqueix un pacte no vinculant per les migracions",
"hypothesis": "S'acorden unes recomanacions per les persones migrades a Marràqueix",
"label": "0"
},
{
"id": 2825,
"premise": "L'ONU adopta a Marràqueix un pacte no vinculant per les migracions",
"hypothesis": "Les persones migrades seran acollides a Marràqueix",
"label": "1"
},
{
"id": 2431,
"premise": "L'ONU adopta a Marràqueix un pacte no vinculant per les migracions",
"hypothesis": "L'acord impulsat per l'ONU lluny de tancar-se",
"label": "2"
},
</pre>
### Data Fields
- premise: text
- hypothesis: text related to the premise
- label: relation between premise and hypothesis:
* 0: entailment
* 1: neutral
* 2: contradiction
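Since the JSON files serialize `label` as a string, a small lookup helper (a sketch only, not part of the dataset) is handy when reading the raw files; the sample record below is the first instance shown above.

```python
# Integer codes as documented above; labels arrive as strings in the JSON files.
LABELS = {0: "entailment", 1: "neutral", 2: "contradiction"}

example = {
    "id": 3247,
    "premise": "L'ONU adopta a Marràqueix un pacte no vinculant per les migracions",
    "hypothesis": "S'acorden unes recomanacions per les persones migrades a Marràqueix",
    "label": "0",
}

# Cast the string label to int before the lookup.
print(LABELS[int(example["label"])])
```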
### Data Splits
* dev.json: 2116 examples
* test.json: 2117 examples
* train.json: 16930 examples
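The three splits add up exactly to the 21,163 premise-hypothesis pairs cited in the summary (roughly an 80/10/10 partition):

```python
# Split sizes as listed above.
splits = {"train": 16930, "dev": 2116, "test": 2117}

total = sum(splits.values())
print(total)  # matches the 21,163 pairs stated in the dataset summary
```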
## Dataset Creation
### Curation Rationale
We created this dataset to contribute to the development of language models in Catalan, a low-resource language.
### Source Data
Source sentences are extracted from the [Catalan Textual Corpus](https://doi.org/10.5281/zenodo.4519349) and from [VilaWeb](https://www.vilaweb.cat) newswire.
#### Initial Data Collection and Normalization
12,000 sentences from the BSC [Catalan Textual Corpus](https://doi.org/10.5281/zenodo.4519349), together with 6,200 headlines from the Catalan news site [VilaWeb](https://www.vilaweb.cat), were chosen randomly. We filtered them by different criteria, such as length and stand-alone intelligibility. For each selected text, we commissioned 3 hypotheses (one for each entailment category) to be written by a team of native annotators.
Some sentence pairs were excluded because of inconsistencies.
#### Who are the source language producers?
The Catalan Textual Corpus corpus consists of several corpora gathered from web crawling and public corpora. More information can be found [here](https://doi.org/10.5281/zenodo.4519349).
[VilaWeb](https://www.vilaweb.cat) is a Catalan newswire.
### Annotations
#### Annotation process
We commissioned 3 hypotheses (one for each entailment category) to be written by a team of annotators.
#### Who are the annotators?
The annotators are a team of native-speaker collaborators from two independent companies.
### Personal and Sensitive Information
No personal or sensitive information is included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this dataset contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@bsc.es)
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/ca/inici/index.html#googtrans(ca|en)) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina).
### Licensing Information
This work is licensed under an <a rel="license" href="https://creativecommons.org/licenses/by-nc-nd/4.0/">Attribution-NonCommercial-NoDerivatives 4.0 International License</a>.
### Citation Information
```
@inproceedings{armengol-estape-etal-2021-multilingual,
title = "Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? {A} Comprehensive Assessment for {C}atalan",
author = "Armengol-Estap{\'e}, Jordi and
Carrino, Casimiro Pio and
Rodriguez-Penagos, Carlos and
de Gibert Bonet, Ona and
Armentano-Oller, Carme and
Gonzalez-Agirre, Aitor and
Melero, Maite and
Villegas, Marta",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.437",
doi = "10.18653/v1/2021.findings-acl.437",
pages = "4933--4946",
}
```
[DOI](https://doi.org/10.5281/zenodo.4529183)
|
Kunling/layoutlm_resume_data | ---
license: bsd
---
|