datasetId | card |
|---|---|
bigbio/bionlp_st_2013_pc |
---
language:
- en
bigbio_language:
- English
license: other
multilinguality: monolingual
bigbio_license_shortname: GENIA_PROJECT_LICENSE
pretty_name: BioNLP 2013 PC
homepage: https://github.com/openbiocorpora/bionlp-st-2013-pc
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- EVENT_EXTRACTION
- NAMED_ENTITY_RECOGNITION
- COREFERENCE_RESOLUTION
---
# Dataset Card for BioNLP 2013 PC
## Dataset Description
- **Homepage:** https://github.com/openbiocorpora/bionlp-st-2013-pc
- **Pubmed:** True
- **Public:** True
- **Tasks:** EE, NER, COREF
The Pathway Curation (PC) task is a main event extraction task of the BioNLP Shared Task (ST) 2013.
The PC task concerns the automatic extraction of biomolecular reactions from text.
The task setting, representation and semantics are defined with respect to pathway
model standards and ontologies (SBML, BioPAX, SBO) and documents selected by relevance
to specific model reactions. Two BioNLP ST 2013 participants successfully completed
the PC task. The highest achieved F-score, 52.8%, indicates that event extraction is
a promising approach to supporting pathway curation efforts.
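A minimal loading sketch is given below; note that the configuration name follows BigBio's usual `*_bigbio_kb` naming and is an assumption, so check the repository's available configurations before relying on it.
```python
from datasets import load_dataset

# NOTE: the config name below is an assumption based on BigBio's typical
# "*_bigbio_kb" convention; script-based datasets may also require
# trust_remote_code=True. Verify the actual configs exposed by this repo.
ds = load_dataset("bigbio/bionlp_st_2013_pc", name="bionlp_st_2013_pc_bigbio_kb")

# Each document carries entity and event annotations for event extraction.
doc = ds["train"][0]
print(doc.keys())
```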
## Citation Information
```
@inproceedings{ohta-etal-2013-overview,
title = "Overview of the Pathway Curation ({PC}) task of {B}io{NLP} Shared Task 2013",
author = "Ohta, Tomoko and
Pyysalo, Sampo and
Rak, Rafal and
Rowley, Andrew and
Chun, Hong-Woo and
Jung, Sung-Jae and
Choi, Sung-Pil and
Ananiadou, Sophia and
Tsujii, Jun{'}ichi",
booktitle = "Proceedings of the {B}io{NLP} Shared Task 2013 Workshop",
month = aug,
year = "2013",
address = "Sofia, Bulgaria",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W13-2009",
pages = "67--75",
}
```
|
TokenBender/Tamil_chat_dataset | ---
license: apache-2.0
language:
- ta
- en
--- |
huggingface/autotrain-data-imgstg1 | Invalid username or password. |
wentingzhao/knn-prompt-datastore | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2360312955
num_examples: 2934591
download_size: 1352870614
dataset_size: 2360312955
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Flmc/DISC-Med-SFT | ---
license: apache-2.0
task_categories:
- question-answering
- conversational
language:
- zh
tags:
- medical
size_categories:
- 100K<n<1M
---
This is a repository containing a subset of the DISC-Med-SFT Dataset.
Check [DISC-MedLLM](https://github.com/FudanDISC/DISC-MedLLM) for more information. |
pking/SMG-NFT | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
language_creators:
- other
multilinguality:
- monolingual
pretty_name: 'SMG-NFT'
size_categories:
- n<1K
source_datasets:
-
tags: []
task_categories:
- text-to-image
task_ids: []
---
# Dataset Card for SMG-NFT
## Examples
## Citation
|
heliosprime/twitter_dataset_1713223714 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20869
num_examples: 60
download_size: 19515
dataset_size: 20869
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713223714"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
usvsnsp/generation-num-duplicates | ---
dataset_info:
features:
- name: sequence_id
dtype: uint32
- name: counts
dtype: uint32
splits:
- name: duped
num_bytes: 1171456000
num_examples: 146432000
- name: deduped
num_bytes: 1171456000
num_examples: 146432000
download_size: 1915148851
dataset_size: 2342912000
configs:
- config_name: default
data_files:
- split: duped
path: data/duped-*
- split: deduped
path: data/deduped-*
---
|
ShoukanLabs/OpenNiji-Dataset | ---
task_categories:
- text-to-image
language:
- en
- ja
- ko
tags:
- anime
- dataset
- Nijijourney
- Midjourney
- discord
size_categories:
- 100K<n<1M
license: cc-by-nc-4.0
---
# NOTE:
Discord has recently added link expiry and tracking for its CDN content; however, this applies to CDN attachments accessed outside of Discord. Because this dataset was scraped directly from the API, we are uncertain whether URL decay will become a problem. We have already created split versions of the dataset to combat this; we are aware that this may not be an option for everyone, and we apologise. |
Enkhmanlai/khanbank | ---
license: mit
---
|
parksez/superalloy2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 142049
num_examples: 310
download_size: 19540
dataset_size: 142049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "superalloy2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ProtoEWAY/NEWDATASET | ---
license: unknown
---
|
jinaai/flores_clustering | ---
dataset_info:
features:
- name: sentences
sequence: string
- name: labels
sequence: string
splits:
- name: test
num_bytes: 249084
num_examples: 1
download_size: 154328
dataset_size: 249084
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
license: cc-by-sa-4.0
---
Data was derived from https://huggingface.co/datasets/facebook/flores
We normalized topics by (1) making them lowercase and (2) removing subcategories ('travel, expenses' -> 'travel'). Afterwards, we dropped every category that contained fewer than 15 sentences.
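A rough sketch of this normalization is shown below; the input format (pairs of sentence and raw topic string taken from Flores) and the helper names are illustrative assumptions, while the output fields match the `sentences`/`labels` schema listed above.
```python
from collections import Counter

def normalize_topic(topic: str) -> str:
    # Lowercase and drop subcategories, e.g. 'travel, expenses' -> 'travel'.
    return topic.lower().split(",")[0].strip()

def build_clustering(pairs):
    # `pairs` is an iterable of (sentence, raw_topic) tuples from Flores;
    # this input format is an assumption made for illustration.
    labeled = [(sentence, normalize_topic(topic)) for sentence, topic in pairs]
    counts = Counter(label for _, label in labeled)
    # Drop every category with fewer than 15 sentences.
    kept = [(s, l) for s, l in labeled if counts[l] >= 15]
    return {
        "sentences": [s for s, _ in kept],
        "labels": [l for _, l in kept],
    }
```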
The Flores-200 dataset is hosted by Facebook and licensed under the [Creative Commons Attribution-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-sa/4.0/). |
joey234/mmlu-high_school_physics-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 92715
num_examples: 151
download_size: 51970
dataset_size: 92715
---
# Dataset Card for "mmlu-high_school_physics-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gaizerick/falasvayne | ---
license: openrail
---
|
aymen414/Comic | ---
license: apache-2.0
---
|
jayelm/natural-instructions | ---
annotations_creators:
- crowdsourced
- expert-generated
language:
- en
multilinguality:
- monolingual
size_categories:
- 100M<n<1B
task_categories:
- other
---
Preprocessed version of Super-Natural-Instructions from https://github.com/allenai/natural-instructions/tree/master/splits. The same inputs may appear with different outputs; to avoid duplicate inputs, you can deduplicate by the `id` or the `inputs` field (see the sketch after the list of improvements below).
This is modified from https://huggingface.co/datasets/Muennighoff/natural-instructions
with a few improvements:
1. Adds positive/negative examples, outputs, explanations for each task, to
support different task definitions.
2. Adds an "eval" field which is True for the first 100 examples of each
test task (119 * 100 = 11900 examples). This field indicates whether an example
is part of the abbreviated + balanced test split. See
https://github.com/allenai/natural-instructions/blob/master/src/reorder_instances_for_testing.py.
3. Adds an "eval" field to the training dataset, which can be used as an
in-domain evaluation set. To do so, we sample a balanced set of the first 15
examples of each train split (757 * 15 = 11355 examples) and mark the "eval"
field as true.
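A minimal sketch of the deduplication and eval-subset filtering described above, assuming the data loads directly with `datasets.load_dataset` and exposes the `id` and `eval` columns; the split name is also an assumption.
```python
from datasets import load_dataset

# NOTE: split and column names are assumptions based on the card's
# description; adjust them to whatever this repository actually exposes.
ds = load_dataset("jayelm/natural-instructions", split="train")

# Deduplicate by `id` (the same inputs can appear with different outputs).
seen_ids = set()
keep_indices = []
for i, example_id in enumerate(ds["id"]):
    if example_id not in seen_ids:
        seen_ids.add(example_id)
        keep_indices.append(i)
deduped = ds.select(keep_indices)

# The in-domain evaluation subset is marked by the boolean "eval" field.
in_domain_eval = ds.filter(lambda ex: ex["eval"])
```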
|
CyberHarem/regensburg_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of regensburg/レーゲンスブルク/雷根斯堡 (Azur Lane)
This is the dataset of regensburg/レーゲンスブルク/雷根斯堡 (Azur Lane), containing 94 images and their tags.
The core tags of this character are `long_hair, breasts, yellow_eyes, horns, large_breasts, blue_hair, twintails, bangs, tail, pointy_ears, eyewear_on_head, sunglasses`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 94 | 177.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/regensburg_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 94 | 85.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/regensburg_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 258 | 198.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/regensburg_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 94 | 149.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/regensburg_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 258 | 295.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/regensburg_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/regensburg_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bodysuit, wings, smile, bodystocking, demon_girl, skin_tight, cleavage, demon_horns, slit_pupils, dragon_girl, thighhighs |
| 1 | 6 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, blush, choker, looking_at_viewer, solo, blue_sky, day, navel, outdoors, slingshot_swimsuit, bare_shoulders, bracelet, cleavage, smile, thighs, water |
| 2 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, barefoot, smile, bare_shoulders, closed_mouth, red_nails, spread_legs, stomach, beach, black_one-piece_swimsuit, blush, bracelet, cleavage, day, outdoors, slingshot_swimsuit, wet, ass_visible_through_thighs, black_choker, blue_sky, criss-cross_halter, kneeling, ocean, toenail_polish, demon_horns, full_body, cloud, demon_tail |
| 3 | 9 |  |  |  |  |  | 1girl, blush, hetero, solo_focus, 1boy, nipples, penis, sweat, black_bikini, open_mouth, bar_censor, choker, navel, spread_legs, collar, huge_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | bodysuit | wings | smile | bodystocking | demon_girl | skin_tight | cleavage | demon_horns | slit_pupils | dragon_girl | thighhighs | black_one-piece_swimsuit | blush | choker | blue_sky | day | navel | outdoors | slingshot_swimsuit | bare_shoulders | bracelet | thighs | water | barefoot | closed_mouth | red_nails | spread_legs | stomach | beach | wet | ass_visible_through_thighs | black_choker | criss-cross_halter | kneeling | ocean | toenail_polish | full_body | cloud | demon_tail | hetero | solo_focus | 1boy | nipples | penis | sweat | black_bikini | open_mouth | bar_censor | collar | huge_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:--------|:--------|:---------------|:-------------|:-------------|:-----------|:--------------|:--------------|:--------------|:-------------|:---------------------------|:--------|:---------|:-----------|:------|:--------|:-----------|:---------------------|:-----------------|:-----------|:---------|:--------|:-----------|:---------------|:------------|:--------------|:----------|:--------|:------|:-----------------------------|:---------------|:---------------------|:-----------|:--------|:-----------------|:------------|:--------|:-------------|:---------|:-------------|:-------|:----------|:--------|:--------|:---------------|:-------------|:-------------|:---------|:---------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | | | X | | | | X | X | | | | X | X | | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | | | | | | | | | | | | | X | X | | | X | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
svenschultze/artificial-vs-natural | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: margin
dtype: int64
splits:
- name: train
num_bytes: 26691782.61397505
num_examples: 8153
- name: test
num_bytes: 2966117.3860249477
num_examples: 906
download_size: 15568743
dataset_size: 29657900.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
nithin1995/dfuc_sroie_image | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 560563801.0
num_examples: 973
download_size: 499264712
dataset_size: 560563801.0
---
# Dataset Card for "dfuc_sroie_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidggphy/voxpopuli_nl_validation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
- name: speaker_embeddings
sequence: float32
splits:
- name: train
num_bytes: 181816747.2
num_examples: 1107
- name: test
num_bytes: 20201860.8
num_examples: 123
download_size: 201927043
dataset_size: 202018608.0
---
# Dataset Card for "voxpopuli_nl_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chansurgeplus/mt_bench_gpt4_single_pairs_judgments | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: model_a
dtype: string
- name: model_b
dtype: string
- name: winner
dtype: string
- name: judge
dtype: string
- name: conversation_a
list:
- name: content
dtype: string
- name: role
dtype: string
- name: conversation_b
list:
- name: content
dtype: string
- name: role
dtype: string
- name: turn
dtype: int64
splits:
- name: train
num_bytes: 427693685
num_examples: 89760
download_size: 39697828
dataset_size: 427693685
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Chinese_Speaking_English_Speech_Data_by_Mobile_phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Chinese_Speaking_English_Speech_Data_by_Mobile_phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/32?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains 100,000 colloquial English sentences recorded by 3,691 Chinese speakers, covering many domestic dialect zones such as Jiangsu, Shandong, Beijing, and Henan, and capturing the specific accents of Chinese speakers of English. The recording texts consist of commonly used sentences with rich content, broad coverage of fields, and balanced phoneme distribution. The data can be used to improve the performance of speech recognition systems on English spoken by Chinese speakers.
For more details, please refer to the link: https://www.nexdata.ai/datasets/32?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
English (as spoken by Chinese speakers)
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
skprime11/dataset | ---
license: mit
---
|
armanzarei/celebhq_canny_conditioned | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': female
'1': male
- name: canny
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2874338816.0
num_examples: 28000
download_size: 2876727551
dataset_size: 2874338816.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dracoglacius/timit | ---
license: mit
---
|
jiovine/pixel-art-nouns | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 364572580.625
num_examples: 49859
download_size: 328291373
dataset_size: 364572580.625
---
# Dataset Card for "pixel-art-nouns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khoomeik/gzipscale-code-C-2.6M | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 10387940
num_examples: 10105
download_size: 2682329
dataset_size: 10387940
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_NurtureAI__Hermes-2-Pro-Mistral-7B | ---
pretty_name: Evaluation run of NurtureAI/Hermes-2-Pro-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NurtureAI/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NurtureAI/Hermes-2-Pro-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NurtureAI__Hermes-2-Pro-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T16:01:06.227893](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__Hermes-2-Pro-Mistral-7B/blob/main/results_2024-04-15T16-01-06.227893.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.624271272052464,\n\
\ \"acc_stderr\": 0.03255341217390494,\n \"acc_norm\": 0.6258718188832934,\n\
\ \"acc_norm_stderr\": 0.033203643219554525,\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5899114428497659,\n\
\ \"mc2_stderr\": 0.015856288399141282\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257184,\n\
\ \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859859\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6483768173670583,\n\
\ \"acc_stderr\": 0.004765012078929386,\n \"acc_norm\": 0.8273252340171281,\n\
\ \"acc_norm_stderr\": 0.003771934042799157\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432104,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432104\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764805,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764805\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830503,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010344,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010344\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876166\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.01457265038340916,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.01457265038340916\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729487,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729487\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303956,\n\
\ \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303956\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"\
acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954843,\n \
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954843\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786838,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786838\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5899114428497659,\n\
\ \"mc2_stderr\": 0.015856288399141282\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908185\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.604245640636846,\n \
\ \"acc_stderr\": 0.013469823701048815\n }\n}\n```"
repo_url: https://huggingface.co/NurtureAI/Hermes-2-Pro-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|arc:challenge|25_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|gsm8k|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hellaswag|10_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-01-06.227893.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T16-01-06.227893.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- '**/details_harness|winogrande|5_2024-04-15T16-01-06.227893.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T16-01-06.227893.parquet'
- config_name: results
data_files:
- split: 2024_04_15T16_01_06.227893
path:
- results_2024-04-15T16-01-06.227893.parquet
- split: latest
path:
- results_2024-04-15T16-01-06.227893.parquet
---
# Dataset Card for Evaluation run of NurtureAI/Hermes-2-Pro-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NurtureAI/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NurtureAI/Hermes-2-Pro-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NurtureAI__Hermes-2-Pro-Mistral-7B",
"harness_winogrande_5",
split="train")
```
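The aggregated metrics can be loaded the same way; a minimal sketch, assuming the `results` config and the `latest` split declared in the YAML header of this card:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; "latest" always points
# to the parquet file of the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_NurtureAI__Hermes-2-Pro-Mistral-7B",
	"results",
	split="latest")
print(results[0])
```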
## Latest results
These are the [latest results from run 2024-04-15T16:01:06.227893](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__Hermes-2-Pro-Mistral-7B/blob/main/results_2024-04-15T16-01-06.227893.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.624271272052464,
"acc_stderr": 0.03255341217390494,
"acc_norm": 0.6258718188832934,
"acc_norm_stderr": 0.033203643219554525,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5899114428497659,
"mc2_stderr": 0.015856288399141282
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257184,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859859
},
"harness|hellaswag|10": {
"acc": 0.6483768173670583,
"acc_stderr": 0.004765012078929386,
"acc_norm": 0.8273252340171281,
"acc_norm_stderr": 0.003771934042799157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432104,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432104
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764805,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764805
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830503,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830503
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010344,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010344
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876166,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876166
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.01457265038340916,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.01457265038340916
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729487,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729487
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954843,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954843
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786838,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786838
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5899114428497659,
"mc2_stderr": 0.015856288399141282
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908185
},
"harness|gsm8k|5": {
"acc": 0.604245640636846,
"acc_stderr": 0.013469823701048815
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
imvladikon/nemo_corpus | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- he
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-reuters-corpus
task_categories:
- token-classification
task_ids:
- named-entity-recognition
train-eval-index:
- config: nemo_corpus
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: validation
test_split: test
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
---
# NEMO-Corpus - The Hebrew Named Entities and Morphology Corpus
## Config and Usage
Configs:
* flat_token - flat tags
* nested_token - nested tags
* flat_morph - flat tags with morphologically presegmented tokens
* nested_morph - nested tags with morphologically presegmented tokens
Note: it seems that a couple of samples in the flat_token and nested_token configs are mistakenly presegmented, and as a result these samples contain whitespace inside tokens.
```python
from datasets import load_dataset
# the main corpus
ds = load_dataset('imvladikon/nemo_corpus', "flat_token")
for sample in ds["train"]:
print(sample)
# the nested corpus
ds = load_dataset('imvladikon/nemo_corpus', "nested_morph")
```
Class names and label encoding/decoding can be obtained through these functions:
```python
idx2label = ds["train"].features["ner_tags"].feature.int2str
label2idx = ds["train"].features["ner_tags"].feature.str2int
```
or just use the `raw_tags` field.
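For example, a minimal sketch (assuming the `flat_token` config loaded above) that decodes the integer tags of one sample back into their BIOSE strings:
```python
from datasets import load_dataset

ds = load_dataset('imvladikon/nemo_corpus', "flat_token")
idx2label = ds["train"].features["ner_tags"].feature.int2str

sample = ds["train"][0]
# map each integer tag back to its BIOSE string and pair it with its token
decoded = [idx2label(tag) for tag in sample["ner_tags"]]
print(list(zip(sample["tokens"], decoded)))
```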
## Fields
Available fields (flat configs):
* "id"
* "sentence"
* "tokens"
* "raw_tags"
* "ner_tags"
Example of a single record for `flat`:
```json
{'id': '0', 'tokens': ['"', 'תהיה', 'נקמה', 'ו', 'בגדול', '.'], 'sentence': '" תהיה נקמה ו בגדול .', 'raw_tags': ['O', 'O', 'O', 'O', 'O', 'O'], 'ner_tags': [24, 24, 24, 24, 24, 24]}
```
Example of a single record for `nested`:
```json
{'id': '0', 'tokens': ['"', 'תהיה', 'נקמה', 'ו', 'בגדול', '.'], 'ner_tags': [24, 24, 24, 24, 24, 24], 'ner_tags_2': [24, 24, 24, 24, 24, 24], 'ner_tags_3': [24, 24, 24, 24, 24, 24], 'ner_tags_4': [24, 24, 24, 24, 24, 24]}
```
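A similar sketch for a `nested` config, reading all annotation layers (the `ner_tags_*` field names are taken from the example record above and are assumed to be present in the chosen config):
```python
from datasets import load_dataset

ds = load_dataset('imvladikon/nemo_corpus', "nested_morph")
sample = ds["train"][0]

# the first, widest layer is "ner_tags"; deeper nesting layers use numbered suffixes
for name in ["ner_tags", "ner_tags_2", "ner_tags_3", "ner_tags_4"]:
    idx2label = ds["train"].features[name].feature.int2str
    print(name, [idx2label(tag) for tag in sample[name]])
```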
## Dataset Description
This is the README.md of the [original repository](https://github.com/OnlpLab/NEMO-Corpus).
Named Entity (NER) annotations of the Hebrew Treebank (Haaretz newspaper) corpus, including: morpheme and token level NER labels, nested mentions, and more.
We publish the NEMO corpus in the TACL paper [*"Neural Modeling for Named Entities and Morphology (NEMO<sup>2</sup>)"*](https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00404/107206/Neural-Modeling-for-Named-Entities-and-Morphology) [1], where we use it in extensive experiments and analyses, showing the importance of morphological boundaries for neural modeling of NER in morphologically rich languages. Code for these models and experiments can be found in the [NEMO code repo](https://github.com/OnlpLab/NEMO).
## Main features:
1. Morpheme, token-single and token-multi sequence labels. Morpheme labels provide exact boundaries, token-multi labels provide partial sub-word morphological information but no exact boundaries, and token-single labels provide only token-level information.
1. All annotations are in `BIOSE` format (`B`=Begin, `I`=Inside, `O`=Outside, `S`=Singleton, `E`=End).
1. Widely-used OntoNotes entity category set: `GPE` (geo-political entity), `PER` (person), `LOC` (location), `ORG` (organization), `FAC` (facility), `EVE` (event), `WOA` (work-of-art), `ANG` (language), `DUC` (product).
1. NEMO includes NER annotations for the two major versions of the Hebrew Treebank, UD (Universal Dependency) and SPMRL. These can be aligned to the other morphosyntactic information layers of the treebank using [bclm](https://github.com/OnlpLab/bclm).
1. We provide nested mentions. Only the first, widest, layer is used in the NEMO<sup>2</sup> paper. We invite you to take on this challenge!
1. Guidelines used for annotation are provided [here](./guidelines/).
1. The corpus was annotated by two native Hebrew speakers with academic education, and curated by the project manager. We provide the original annotations made by the annotators as well, to promote work on [learning with disagreements](https://sites.google.com/view/semeval2021-task12/home).
1. Annotation was performed using [WebAnno](https://webanno.github.io/webanno/) (version 3.4.5).
## Legend for Files and Folder Structure
1. The two main [data](./data/) folders are [ud](./data/ud/) and [spmrl](./data/spmrl/), corresponding to the relevant Hebrew Treebank corpus version.
1. Both contain a `gold` folder ([spmrl/gold](./data/spmrl/gold/), [ud/gold](./data/ud/gold/)) of gold curated annotations.
1. Each `gold` folder contains files of the three input-output variants (morph, token-multi, token-single), for each of the treebank splits (train, dev, test).
1. Each `gold` folder also contains a `nested` subfolder ([spmrl/nested](./data/spmrl/gold/nested/), [ud/nested](./data/ud/gold/nested/)), which contains all layers of nested mentions (the first layer is the layer used in the non-nested files, and in the NEMO<sup>2</sup> paper [1])
1. The `ud` folder also contains an [ab_annotators](./data/ud/ab_annotators/) folder. This folder contains the original annotations made by each annotator (named `a`, `b`), including first-layer and nested annotations.
1. *\*UPDATE 2021-09-06\** `ud` folder now contains a [pilot_annotations](./data/ud/pilot_annotations/) folder. This folder contains the original annotations made by each annotator in our two phase pilot (phase I - sentences 1-200 of dev; phase II - sentences 201-400 of dev).
## Basic Corpus Statistics
| | train | dev | test |
|------------------------------| --:| --:| --:|
| Sentences | 4,937 | 500 | 706 |
| Tokens | 93,504 | 8,531 | 12,619 |
| Morphemes | 127,031 | 11,301 | 16,828 |
| All mentions | 6,282 | 499 | 932 |
| Type: Person (PER) | 2,128 | 193 | 267 |
| Type: Organization (ORG) | 2,043 | 119 | 408 |
| Type: Geo-Political (GPE) | 1,377 | 121 | 195 |
| Type: Location (LOC) | 331 | 28 | 41 |
| Type: Facility (FAC) | 163 | 12 | 11 |
| Type: Work-of-Art (WOA) | 114 | 9 | 6 |
| Type: Event (EVE) | 57 | 12 | 0 |
| Type: Product (DUC) | 36 | 2 | 3 |
| Type: Language (ANG) | 33 | 3 | 1 |
## Aligned Treebank Versions
The NEMO corpus matches the treebank version of [bclm v.1.0.0](https://github.com/OnlpLab/bclm/releases/tag/v1.0.0-alpha).
This version is based on the [HTB UD v2.2](https://github.com/UniversalDependencies/UD_Hebrew-HTB/releases/tag/r2.2) and the [latest SPMRL HTB version](https://github.com/OnlpLab/HebrewResources/tree/102674bb030f5836e1ab827feb63954ad7a6f8fe/HebrewTreebank/hebtb).
The changes include (but might not be limited to) the following:
1. Flagged and dropped duplicate and leaking sentences (between train and test). In addition to the sentences already removed in the bclm v1.0.0 HTB version, the following duplicate sentences were dropped as well (SPMRL sentence IDs): 5438, 5444, 5445, 5446, 5448, 5449, 5450, 5451, 5453, 5459 (in the bclm dataframes, these are marked in the `duplicate_sent_id` column).
To read the treebank (UD/SPMRL) in a way that matches the NEMO corpus, you can use the following:
```python
import bclm
dropped = [5438, 5444, 5445, 5446, 5448, 5449, 5450, 5451, 5453, 5459]
spdf = bclm.read_dataframe('spmrl') # load SPMRL treebank dataframe
global_dropped = [spdf[spdf.sent_id==d].global_sent_id.iat[0] for d in dropped]
uddf = bclm.read_dataframe('ud') # load UD treebank dataframe
uddf = uddf[(~uddf.global_sent_id.isin(global_dropped))] # remove extra duplicates
spdf = spdf[(~spdf.sent_id.isin(dropped))] # remove extra duplicates
# The resulting dataframes contain gold morph NER labels in the `biose_layer0`, `biose_layer1`... columns.
```
2. The UD treebank contains many more duplicates. In this version, all sentences exist in both the UD and SPMRL versions, and all sentences and tokens are aligned between UD and SPMRL.
2. Fixed numbers that were originally reversed.
2. Fixed mismatches between tokens and morphemes.
2. Added Binyan feature.
2. No individual morphemes or tokens were added or removed, only complete sentences.
## Evaluation
An evaluation script is provided in the [NEMO code repo](https://github.com/OnlpLab/NEMO#evaluation) along with evaluation instructions.
## Citations
##### [1]
If you use the NEMO corpus in your research, please cite the NEMO<sup>2</sup> paper:
```bibtex
@article{10.1162/tacl_a_00404,
author = {Bareket, Dan and Tsarfaty, Reut},
title = "{Neural Modeling for Named Entities and Morphology (NEMO2)}",
journal = {Transactions of the Association for Computational Linguistics},
volume = {9},
pages = {909-928},
year = {2021},
month = {09},
abstract = "{Named Entity Recognition (NER) is a fundamental NLP task, commonly formulated as classification over a sequence of tokens. Morphologically rich languages (MRLs) pose a challenge to this basic formulation, as the boundaries of named entities do not necessarily coincide with token boundaries, rather, they respect morphological boundaries. To address NER in MRLs we then need to answer two fundamental questions, namely, what are the basic units to be labeled, and how can these units be detected and classified in realistic settings (i.e., where no gold morphology is available). We empirically investigate these questions on a novel NER benchmark, with parallel token- level and morpheme-level NER annotations, which we develop for Modern Hebrew, a morphologically rich-and-ambiguous language. Our results show that explicitly modeling morphological boundaries leads to improved NER performance, and that a novel hybrid architecture, in which NER precedes and prunes morphological decomposition, greatly outperforms the standard pipeline, where morphological decomposition strictly precedes NER, setting a new performance bar for both Hebrew NER and Hebrew morphological decomposition tasks.}",
issn = {2307-387X},
doi = {10.1162/tacl_a_00404},
url = {https://doi.org/10.1162/tacl\_a\_00404},
eprint = {https://direct.mit.edu/tacl/article-pdf/doi/10.1162/tacl\_a\_00404/1962472/tacl\_a\_00404.pdf},
}
```
##### [2]
Please cite the Hebrew Treebank as well, which is described in the following paper:
```bibtex
@article{sima2001building,
title={Building a tree-bank of modern Hebrew text},
author={Sima’an, Khalil and Itai, Alon and Winter, Yoad and Altman, Alon and Nativ, Noa},
journal={Traitement Automatique des Langues},
volume={42},
number={2},
pages={247--380},
year={2001},
publisher={Citeseer}
}
```
##### [3]
The UD version of the Hebrew Treebank is described in:
```bibtex
@inproceedings{sade-etal-2018-hebrew,
title = "The {H}ebrew {U}niversal {D}ependency Treebank: Past Present and Future",
author = "Sade, Shoval and
Seker, Amit and
Tsarfaty, Reut",
booktitle = "Proceedings of the Second Workshop on Universal Dependencies ({UDW} 2018)",
month = nov,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/W18-6016",
doi = "10.18653/v1/W18-6016",
pages = "133--143",
abstract = "The Hebrew treebank (HTB), consisting of 6221 morpho-syntactically annotated newspaper sentences, has been the only resource for training and validating statistical parsers and taggers for Hebrew, for almost two decades now. During these decades, the HTB has gone through a trajectory of automatic and semi-automatic conversions, until arriving at its UDv2 form. In this work we manually validate the UDv2 version of the HTB, and, according to our findings, we apply scheme changes that bring the UD HTB to the same theoretical grounds as the rest of UD. Our experimental parsing results with UDv2New confirm that improving the coherence and internal consistency of the UD HTB indeed leads to improved parsing performance. At the same time, our analysis demonstrates that there is more to be done at the point of intersection of UD with other linguistic processing layers, in particular, at the points where UD interfaces external morphological and lexical resources.",
}
``` |
novaDE/novaDE | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
La-matrice/french_sentences_19M | ---
task_categories:
- text-generation
language:
- fr
pretty_name: f
---
## This dataset consists of more than 19 million French sentences.
This diverse collection originates from a variety of sources, including books, songs, Wikipedia, and translation datasets. |
sandrocaseiro/fashionpedia | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: bbox_id
sequence: int64
- name: bbox
sequence:
sequence: float64
- name: category
sequence:
class_label:
names:
'0': shirt, blouse
'1': top, t-shirt, sweatshirt
'2': sweater
'3': cardigan
'4': jacket
'5': vest
'6': pants
'7': shorts
'8': skirt
'9': coat
'10': dress
'11': jumpsuit
'12': cape
'13': glasses
'14': hat
'15': headband, head covering, hair accessory
'16': tie
'17': glove
'18': watch
'19': belt
'20': leg warmer
'21': tights, stockings
'22': sock
'23': shoe
'24': bag, wallet
'25': scarf
'26': umbrella
'27': hood
'28': collar
'29': lapel
'30': epaulette
'31': sleeve
'32': pocket
'33': neckline
'34': buckle
'35': zipper
'36': applique
'37': bead
'38': bow
'39': flower
'40': fringe
'41': ribbon
'42': rivet
'43': ruffle
'44': sequin
'45': tassel
- name: area
sequence: int64
- name: segmentation
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 3812764522.759
num_examples: 45623
- name: val
num_bytes: 100185461.28
num_examples: 1158
download_size: 3519915966
dataset_size: 3912949984.039
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
# Dataset Card for "fashionpedia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FarmerlineML/mampruli_dataset | ---
dataset_info:
features:
- name: transcription
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 470190988.884
num_examples: 5818
- name: test
num_bytes: 59963516.0
num_examples: 742
download_size: 527735941
dataset_size: 530154504.884
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mlxen/squad_contrasting_validation_dataset | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: validation
num_bytes: 10482482
num_examples: 10570
download_size: 1835309
dataset_size: 10482482
---
# Dataset Card for "squad_contrasting_validation_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sbslee/test | ---
license: mit
---
|
DeepFoldProtein/foldseek_combined_processed_unigram32000_512 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: special_tokens_mask
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2778002512
num_examples: 386693
download_size: 848577616
dataset_size: 2778002512
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_mlabonne__NeuralDarewin-7B | ---
pretty_name: Evaluation run of mlabonne/NeuralDarewin-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/NeuralDarewin-7B](https://huggingface.co/mlabonne/NeuralDarewin-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__NeuralDarewin-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T17:48:56.790250](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralDarewin-7B/blob/main/results_2024-02-01T17-48-56.790250.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520779999242557,\n\
\ \"acc_stderr\": 0.03214755914196501,\n \"acc_norm\": 0.6530571027875921,\n\
\ \"acc_norm_stderr\": 0.03279768840920175,\n \"mc1\": 0.4675642594859241,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6291675924515658,\n\
\ \"mc2_stderr\": 0.015571699922487066\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902274,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.01337407861506874\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.68442541326429,\n \
\ \"acc_stderr\": 0.004637944965914613,\n \"acc_norm\": 0.8639713204540929,\n\
\ \"acc_norm_stderr\": 0.0034211839093201612\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n\
\ \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n\
\ \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n\
\ \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n\
\ \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\"\
: 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.025355741263055263,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.025355741263055263\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n\
\ \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8403575989782887,\n\
\ \"acc_stderr\": 0.013097934513263005,\n \"acc_norm\": 0.8403575989782887,\n\
\ \"acc_norm_stderr\": 0.013097934513263005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.016361354769822468,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.016361354769822468\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005723,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005723\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653347,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653347\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274054,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274054\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274645,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4675642594859241,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6291675924515658,\n\
\ \"mc2_stderr\": 0.015571699922487066\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936652\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \
\ \"acc_stderr\": 0.012979892496598287\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/NeuralDarewin-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|arc:challenge|25_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|gsm8k|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hellaswag|10_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-48-56.790250.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T17-48-56.790250.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- '**/details_harness|winogrande|5_2024-02-01T17-48-56.790250.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T17-48-56.790250.parquet'
- config_name: results
data_files:
- split: 2024_02_01T17_48_56.790250
path:
- results_2024-02-01T17-48-56.790250.parquet
- split: latest
path:
- results_2024-02-01T17-48-56.790250.parquet
---
# Dataset Card for Evaluation run of mlabonne/NeuralDarewin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/NeuralDarewin-7B](https://huggingface.co/mlabonne/NeuralDarewin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__NeuralDarewin-7B",
"harness_winogrande_5",
	split="latest")
```
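The aggregated metrics reported below can be pulled from the `results` configuration in the same way; here is a minimal sketch (the exact column layout of that configuration is not documented in this card, so inspect the first record):
```python
from datasets import load_dataset

# Load the aggregated results of the latest evaluation run (see the "results" configuration above).
results = load_dataset(
    "open-llm-leaderboard/details_mlabonne__NeuralDarewin-7B",
    "results",
    split="latest",
)

# Typically one row per run, holding the aggregated metrics shown below.
print(results[0])
```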
## Latest results
These are the [latest results from run 2024-02-01T17:48:56.790250](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__NeuralDarewin-7B/blob/main/results_2024-02-01T17-48-56.790250.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6520779999242557,
"acc_stderr": 0.03214755914196501,
"acc_norm": 0.6530571027875921,
"acc_norm_stderr": 0.03279768840920175,
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6291675924515658,
"mc2_stderr": 0.015571699922487066
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902274,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.01337407861506874
},
"harness|hellaswag|10": {
"acc": 0.68442541326429,
"acc_stderr": 0.004637944965914613,
"acc_norm": 0.8639713204540929,
"acc_norm_stderr": 0.0034211839093201612
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368881,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368881
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8403575989782887,
"acc_stderr": 0.013097934513263005,
"acc_norm": 0.8403575989782887,
"acc_norm_stderr": 0.013097934513263005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.016361354769822468,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.016361354769822468
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653347,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653347
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274054,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274054
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274645,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4675642594859241,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6291675924515658,
"mc2_stderr": 0.015571699922487066
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936652
},
"harness|gsm8k|5": {
"acc": 0.6671721000758151,
"acc_stderr": 0.012979892496598287
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bulico/GWPARACLONE | ---
license: openrail
---
|
karlen532/cosql | ---
license: unknown
---
|
epinnock/smol-evol-feedback-1k-oai-format | ---
dataset_info:
features:
- name: messages
dtype: string
splits:
- name: train
num_bytes: 7068720
num_examples: 1291
download_size: 2695860
dataset_size: 7068720
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
somosnlp/recetas-cocina | ---
license: mit
task_categories:
- table-question-answering
- text-generation
language:
- es
pretty_name: recetas de cocina
size_categories:
- 10K<n<100K
--- |
anarenteriare/test-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 17746224.0
num_examples: 51
download_size: 15937251
dataset_size: 17746224.0
---
# Dataset Card for "test-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Damitrius/Tester | ---
license: unknown
---
|
adamoudaimah/products | ---
license: mit
---
|
Berzerker/ICDAR_RIMES_ocr_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: output_json_dumpsed
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: data/*.parquet
language:
- en
---
|
Umal-exvc/test-captioned-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 111187.0
num_examples: 5
download_size: 111705
dataset_size: 111187.0
---
# Dataset Card for "test-captioned-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_198 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 20214166032.625
num_examples: 210459
download_size: 18270712837
dataset_size: 20214166032.625
---
# Dataset Card for "chunk_198"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
piebro/factorio-blueprint-visualizations | ---
license: cc0-1.0
task_categories:
- text-to-image
tags:
- art
pretty_name: Factorio Blueprint Visualizations Dataset
size_categories:
- n<1K
---
## Dataset Description
This dataset is a collection of visualizations of [Factorio Blueprints](https://wiki.factorio.com/Blueprint), created with the Factorio blueprint visualization tool at https://github.com/piebro/factorio-blueprint-visualizer. The blueprints were collected from https://www.factorio.school/.
## Examples


## Dataset Structure
* "svg_original": The svg downloaded like this from the website
* "svg_rect": The svg reshaped to a rect and a slightly bigger border
* "png_1024x1024": The svg_rect images exported as pngs
## Additional Information
The dataset was used to train this LoRA: https://huggingface.co/piebro/factorio-blueprint-visualizations-sdxl-lora
## Code Attachments
Code to create the rectangular svgs:
```python
import os
import xml.etree.ElementTree as ET
def modify_svg(save_dir, svg_file_path):
tree = ET.parse(svg_file_path)
root = tree.getroot()
# Extract current width and height
width = float(root.attrib['width'].replace('mm', ''))
height = float(root.attrib['height'].replace('mm', ''))
# Calculate new dimensions
new_size = max(width, height) + 200
# Update width and height
root.attrib['width'] = f"{new_size}mm"
root.attrib['height'] = f"{new_size}mm"
# Adjust viewBox for centering content
view_box = root.attrib.get('viewBox', '').split(',')
if len(view_box) == 4:
x, y, vw, vh = map(float, view_box)
dx = vw*0.12
dy = vh*0.12
root.attrib['viewBox'] = f"{x-dx/2}, {y-dy/2}, {vw+dx}, {vh+dy}"
# Write back to file or a new file
tree.write(os.path.join(save_dir, f"modified_{os.path.basename(svg_file_path)}"))
save_dir = ""
original_svg_folder_path = ""
for file_name in os.listdir(original_svg_folder_path):
if file_name.endswith('.svg'):
modify_svg(save_dir, os.path.join(original_svg_folder_path, file_name))
```
Code to create the pngs:
```bash
mkdir pngs
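# Requires ImageMagick (the convert command); each square SVG is rasterized to a 1024x1024 PNG.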
for file in *.svg; do convert "$file" -resize 1024x1024 "pngs/${file%.svg}.png"; done
``` |
neuclir/hc4 | ---
annotations_creators:
- no-annotation
language:
- fa
- ru
- zh
language_creators:
- found
license:
- odc-by
multilinguality:
- multilingual
pretty_name: HC4
size_categories:
- 1M<n<10M
source_datasets:
- extended|c4
tags: []
task_categories:
- text-retrieval
task_ids:
- document-retrieval
---
# Dataset Card for HC4
## Dataset Description
- **Repository:** https://github.com/hltcoe/HC4
- **Paper:** https://arxiv.org/abs/2201.09992
### Dataset Summary
HC4 is a suite of test collections for ad hoc Cross-Language Information Retrieval (CLIR). The documents are Common Crawl News web pages in Chinese, Persian, and Russian.
### Languages
- Chinese
- Persian
- Russian
## Dataset Structure
### Data Instances
| Split | Documents |
|-----------------|----------:|
| `fas` (Persian) | 486K |
| `rus` (Russian) | 4.7M |
| `zho` (Chinese) | 646K |
### Data Fields
- `id`: unique identifier for this document
- `cc_file`: source file from Common Crawl
- `time`: extracted date/time from article
- `title`: title extracted from article
- `text`: extracted article body
- `url`: source URL
## Dataset Usage
Using 🤗 Datasets:
```python
from datasets import load_dataset
dataset = load_dataset('neuclir/hc4')
dataset['fas'] # Persian documents
dataset['rus'] # Russian documents
dataset['zho'] # Chinese documents
```
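Each language config behaves like a standard 🤗 `Dataset` of per-document records, so the fields listed above can be accessed directly; a small sketch:
```python
from datasets import load_dataset

dataset = load_dataset('neuclir/hc4')

# Print the id, title, and URL of the first few Persian documents.
for doc in dataset['fas'].select(range(3)):
    print(doc['id'], doc['title'], doc['url'])
```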
## Citation Information
```
@inproceedings{Lawrie2022HC4,
author = {Dawn Lawrie and James Mayfield and Douglas W. Oard and Eugene Yang},
title = {HC4: A New Suite of Test Collections for Ad Hoc CLIR},
  booktitle = {{Advances in Information Retrieval. 44th European Conference on IR Research (ECIR 2022)}},
year = {2022},
month = apr,
publisher = {Springer},
series = {Lecture Notes in Computer Science},
site = {Stavanger, Norway},
url = {https://arxiv.org/abs/2201.09992}
}
```
|
open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo | ---
pretty_name: Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo](https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T19:40:32.178744](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo/blob/main/results_2024-02-01T19-40-32.178744.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6578797312429587,\n\
\ \"acc_stderr\": 0.03193533127801232,\n \"acc_norm\": 0.6595146193672972,\n\
\ \"acc_norm_stderr\": 0.03257985736954445,\n \"mc1\": 0.6046511627906976,\n\
\ \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7675318116403941,\n\
\ \"mc2_stderr\": 0.01417571671037387\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6953924914675768,\n \"acc_stderr\": 0.013449522109932487,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.01310678488360134\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.710017924716192,\n\
\ \"acc_stderr\": 0.004528264116475881,\n \"acc_norm\": 0.8843855805616411,\n\
\ \"acc_norm_stderr\": 0.003191084792793155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339526,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339526\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48412698412698413,\n \"acc_stderr\": 0.025738330639412152,\n \"\
acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.025738330639412152\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603627,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391943,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650159,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650159\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n\
\ \"acc_stderr\": 0.012767098998525843,\n \"acc_norm\": 0.48891786179921776,\n\
\ \"acc_norm_stderr\": 0.012767098998525843\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6046511627906976,\n\
\ \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7675318116403941,\n\
\ \"mc2_stderr\": 0.01417571671037387\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971862\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5921152388172858,\n \
\ \"acc_stderr\": 0.01353674207564309\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|arc:challenge|25_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|gsm8k|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hellaswag|10_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T19-40-32.178744.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- '**/details_harness|winogrande|5_2024-02-01T19-40-32.178744.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T19-40-32.178744.parquet'
- config_name: results
data_files:
- split: 2024_02_01T19_40_32.178744
path:
- results_2024-02-01T19-40-32.178744.parquet
- split: latest
path:
- results_2024-02-01T19-40-32.178744.parquet
---
# Dataset Card for Evaluation run of macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo](https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo",
"harness_winogrande_5",
split="train")
```
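The per-task configs listed in the metadata above can be loaded the same way. The aggregated numbers live in the `results` config, and every config also exposes a `latest` split pointing to the most recent run; a minimal sketch:
```python
from datasets import load_dataset

# Aggregated metrics of the most recent run (config and split names taken
# from the `configs` section of this card).
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo",
    "results",
    split="latest",
)
print(results[0])
```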
## Latest results
These are the [latest results from run 2024-02-01T19:40:32.178744](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-10.7b-Instruct-truthy-dpo/blob/main/results_2024-02-01T19-40-32.178744.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6578797312429587,
"acc_stderr": 0.03193533127801232,
"acc_norm": 0.6595146193672972,
"acc_norm_stderr": 0.03257985736954445,
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7675318116403941,
"mc2_stderr": 0.01417571671037387
},
"harness|arc:challenge|25": {
"acc": 0.6953924914675768,
"acc_stderr": 0.013449522109932487,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.01310678488360134
},
"harness|hellaswag|10": {
"acc": 0.710017924716192,
"acc_stderr": 0.004528264116475881,
"acc_norm": 0.8843855805616411,
"acc_norm_stderr": 0.003191084792793155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339526,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339526
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.025738330639412152,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.025738330639412152
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603627,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.02293514405391943,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.02293514405391943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650159,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650159
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48891786179921776,
"acc_stderr": 0.012767098998525843,
"acc_norm": 0.48891786179921776,
"acc_norm_stderr": 0.012767098998525843
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6046511627906976,
"mc1_stderr": 0.017115815632418208,
"mc2": 0.7675318116403941,
"mc2_stderr": 0.01417571671037387
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971862
},
"harness|gsm8k|5": {
"acc": 0.5921152388172858,
"acc_stderr": 0.01353674207564309
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
llm-aes/meva_original | ---
dataset_info:
features:
- name: premise
dtype: string
- name: generator
dtype: string
- name: story
dtype: string
- name: human_score
dtype: float64
splits:
- name: train
num_bytes: 1168303
num_examples: 1000
download_size: 670476
dataset_size: 1168303
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fulflimall/allfile | ---
license: unlicense
---
|
adityab99/Automobiles | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Airplane
'1': Bike
'2': Formula 1 cars
'3': Normal car
splits:
- name: train
num_bytes: 14523967.75
num_examples: 510
- name: test
num_bytes: 2583269.25
num_examples: 90
download_size: 17068988
dataset_size: 17107237.0
---
# Dataset Card for "Automobiles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AyoubChLin/20NewsGroup-AgNews-CnnNews | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype:
class_label:
names:
'0': auto
'1': business
'2': entertainment
'3': health
'4': news
'5': politics
'6': sci/tech
'7': sport
'8': world
splits:
- name: train
num_bytes: 227672680
num_examples: 162076
download_size: 134277697
dataset_size: 227672680
task_categories:
- text-classification
language:
- en
size_categories:
- n<1K
---
|
arbml/Ashaar_aruid_v0 | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: tafeelah
dtype: string
- name: meter
dtype: string
splits:
- name: train
num_bytes: 78684
num_examples: 986
download_size: 18630
dataset_size: 78684
---
# Dataset Card for "Ashaar_ardui"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/trento_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of trento/トレント/特伦托 (Azur Lane)
This is the dataset of trento/トレント/特伦托 (Azur Lane), containing 60 images and their tags.
The core tags of this character are `long_hair, breasts, hair_over_one_eye, large_breasts, purple_hair, red_eyes, bangs, very_long_hair, eyewear_on_head, sunglasses, blue_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 87.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 47.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 145 | 106.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 76.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 145 | 151.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trento_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/trento_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, black_bikini, cleavage, navel, solo, blush, looking_at_viewer, o-ring_bikini, bare_shoulders, smile, thigh_strap, collarbone, wrist_scrunchie, black_choker, thighs, bead_bracelet, open_mouth, simple_background, stomach, official_alternate_costume, side-tie_bikini_bottom, closed_mouth, multi-strapped_bikini, o-ring_top, mole, thigh_gap, wet, white_background |
| 1 | 5 |  |  |  |  |  | black_bikini, blue_sky, day, looking_at_viewer, navel, official_alternate_costume, open_mouth, 1girl, cleavage, cowboy_shot, multi-strapped_bikini, o-ring_bikini, outdoors, solo, :d, cloud, collarbone, side-tie_bikini_bottom, black_choker, bracelet, halterneck, ocean, skindentation, standing, thigh_strap |
| 2 | 12 |  |  |  |  |  | looking_at_viewer, 1girl, solo, white_gloves, cape, garter_straps, simple_background, smile, epaulettes, white_background, blush, dress, standing, black_thighhighs, boots, cleavage, red_necktie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_bikini | cleavage | navel | solo | blush | looking_at_viewer | o-ring_bikini | bare_shoulders | smile | thigh_strap | collarbone | wrist_scrunchie | black_choker | thighs | bead_bracelet | open_mouth | simple_background | stomach | official_alternate_costume | side-tie_bikini_bottom | closed_mouth | multi-strapped_bikini | o-ring_top | mole | thigh_gap | wet | white_background | blue_sky | day | cowboy_shot | outdoors | :d | cloud | bracelet | halterneck | ocean | skindentation | standing | white_gloves | cape | garter_straps | epaulettes | dress | black_thighhighs | boots | red_necktie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:--------|:-------|:--------|:--------------------|:----------------|:-----------------|:--------|:--------------|:-------------|:------------------|:---------------|:---------|:----------------|:-------------|:--------------------|:----------|:-----------------------------|:-------------------------|:---------------|:------------------------|:-------------|:-------|:------------|:------|:-------------------|:-----------|:------|:--------------|:-----------|:-----|:--------|:-----------|:-------------|:--------|:----------------|:-----------|:---------------|:-------|:----------------|:-------------|:--------|:-------------------|:--------|:--------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | | X | X | | X | | | X | | | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | | X | X | X | | | X | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
SminC/cartoonizer-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 31770277.0
num_examples: 50
download_size: 31772590
dataset_size: 31770277.0
---
# Dataset Card for "cartoonizer-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hkufyp2024/test | ---
license: apache-2.0
---
|
ashish23/filtered_wikibook | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2567855206.0077558
num_examples: 8433293
- name: test
num_bytes: 7727048.207960854
num_examples: 25377
download_size: 11760058784
dataset_size: 2575582254.215717
---
# Dataset Card for "filtered_wikibook"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HSnake/happy | ---
license: apache-2.0
---
|
weqweasdas/openchat_model0_data_with_rewards | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: type
dtype: string
- name: instances
list:
- name: prompt
dtype: string
- name: responses
sequence: string
- name: rewards
sequence: float64
splits:
- name: train
num_bytes: 164177578
num_examples: 1
download_size: 73760476
dataset_size: 164177578
---
# Dataset Card for "openchat_model0_data_with_rewards"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qbwmwsap/stackoverflow_processed | ---
dataset_info:
features:
- name: token_ids
sequence: int64
- name: source
dtype: string
splits:
- name: train
num_bytes: 176193061000
num_examples: 10720600
download_size: 39208740499
dataset_size: 176193061000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MarkChen1214/SemCor | ---
dataset_info:
features:
- name: ID
sequence: int64
- name: Word
sequence: string
- name: Lemma
sequence: string
- name: POS
sequence: string
- name: Definition
sequence: string
- name: Lemma_sentence
dtype: string
- name: sentence
dtype: string
- name: Lemma_tfidf
sequence: string
- name: Lemma_tfidf_value
sequence: float64
splits:
- name: train
num_bytes: 24209901
num_examples: 20138
download_size: 8568417
dataset_size: 24209901
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for "SemCor – sense-tagged English corpus"
## Description
This dataset is derived from the [wsd_semcor dataset](https://huggingface.co/datasets/spdenisov/wsd_semcor), originally hosted on Hugging Face. It has been preprocessed for tasks related to Word Sense Disambiguation (WSD) and WordNet integration.
## Preprocessing
The original text data underwent the following preprocessing steps:
- Text splitting into individual words (lemmas).
- TF-IDF (Term Frequency-Inverse Document Frequency) analysis to understand the importance of words within the documents.
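The card does not specify which tools were used for these steps. Purely as an illustration of the kind of TF-IDF analysis described above (not necessarily the authors' pipeline), a scikit-learn sketch could look like this:
```python
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus standing in for the SemCor sentences.
corpus = [
    "the committee studied the report",
    "the report was studied by another committee",
]

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(corpus)

# Per-document word importance, analogous to Lemma_tfidf / Lemma_tfidf_value.
for doc_id, row in enumerate(tfidf.toarray()):
    scores = dict(zip(vectorizer.get_feature_names_out(), row))
    print(doc_id, sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:3])
```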
## Structure
The dataset contains:
- Lemmas: Words obtained from splitting the text data.
- TF-IDF values: Quantitative measures of word importance within the documents.
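As a quick way to see these fields, the dataset can be loaded with the `datasets` library. The snippet below is a minimal sketch, assuming the default configuration with a single `train` split as declared in the dataset configuration; the example index is purely illustrative:
```python
from datasets import load_dataset

# Load the preprocessed SemCor data (repository id from this card).
dataset = load_dataset("MarkChen1214/SemCor", split="train")

# Inspect one example: the sentence, its lemmas, and the TF-IDF keywords with their values.
example = dataset[0]
print(example["sentence"])
print(example["Lemma"])
print(list(zip(example["Lemma_tfidf"], example["Lemma_tfidf_value"])))
```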
## Note
The number of elements in **Lemma** and **Lemma_tfidf** might not match. This is because **Lemma** is based on the original dataset and might contain compound words, which may not be recognized by the TF-IDF algorithm.
## Intended Use
This dataset is intended for use in WSD and WordNet integration tasks. It provides foundational data for natural language processing (NLP) research and applications, specifically focusing on understanding word meanings and contextual usage.
## Citation
Data sourced from [wsd_semcor dataset](https://huggingface.co/datasets/spdenisov/wsd_semcor) on Hugging Face.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yaful/DeepfakeTextDetect | ---
license: apache-2.0
---
<div align="center">
<h1>Deepfake Text Detection in the Wild</h1>
<!-- **Authors:** -->
_**Yafu Li<sup>†</sup><sup>‡</sup>, Qintong Li<sup>§</sup>, Leyang Cui<sup>¶</sup>, Wei Bi<sup>¶</sup>,<br>**_
_**Longyue Wang<sup>¶</sup>, Linyi Yang<sup>‡</sup>, Shuming Shi<sup>¶</sup>, Yue Zhang<sup>‡</sup><br>**_
<!-- **Affiliations:** -->
_<sup>†</sup> Zhejiang University,
<sup>‡</sup> Westlake University,
<sup>§</sup> The University of Hong Kong,
<sup>¶</sup> Tencent AI Lab_
Presenting a comprehensive benchmark dataset designed to assess the proficiency of deepfake detectors amidst real-world scenarios.
</div>
## 📌 Table of Contents
- [Introduction](#🚀-introduction)
- [Dataset](#📝-dataset)
- [Try Detection](#🖥%EF%B8%8F-try-detection)
- [Citation](#📚-citation)
## 🚀 Introduction
Recent advances in large language models have enabled them to reach a level of text generation comparable to that of humans.
These models show powerful capabilities across a wide range of content, including news article writing, story generation, and scientific writing.
Such capability further narrows the gap between human-authored and machine-generated texts, highlighting the importance of deepfake text detection to avoid potential risks such as fake news propagation and plagiarism.
In practical scenarios, the detector faces texts from various domains or LLMs without knowing their sources.
To this end, we build **a comprehensive testbed for deepfake text detection**, by gathering texts from various human writings and deepfake texts generated by different LLMs.
The data in this repository is used to evaluate the effectiveness of deepfake detection methods, as described in our paper titled "Deepfake Text Detection in the Wild" (available at https://arxiv.org/abs/2305.13242). We invite you to test your own detection methods on our testbed and encourage you to star our Github repo at https://github.com/yafuly/DeepfakeTextDetect.
## 📝 Dataset
The dataset consists of **447,674** human-written and machine-generated texts from a wide range of sources in the wild:
- Human-written texts from **10 datasets** covering a wide range of writing tasks, e.g., news article writing, story generation, scientific writing, etc.
- Machine-generated texts generated by **27 mainstream LLMs** from 7 sources, e.g., OpenAI, LLaMA, and EleutherAI, etc.
- **6 systematic testbeds** with increasing wildness and detection difficulty.
- **2 wilder test sets**: (1) texts collected from new datasets and generated by GPT-4; (2) paraphrased texts.
### 📥 How to Get the Data
#### 1. Huggingface
You can access the full dataset, which includes the Cross-domains & Cross-models testbed and two additional wilder test sets, through the Huggingface API:
```python
from datasets import load_dataset
dataset = load_dataset("yaful/DeepfakeTextDetect")
```
The loaded dataset includes the standard splits (train.csv, valid.csv and test.csv) and two wilder test sets (test_ood_set_gpt.csv and test_ood_set_gpt_para.csv).
The csv files have three columns: text, label (0 for machine-generated and
1 for human-written) and text source information (e.g., `cmv_human` denotes text written by humans,
whereas `roct_machine_continuation_flan_t5_large` denotes text generated by `flan_t5_large` using a continuation prompt).
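For a quick look at how the labels are distributed, the loaded splits can be inspected directly. The snippet below is a minimal sketch, assuming only the `label` column described above and whatever splits `load_dataset` exposes for this repository:
```python
from collections import Counter
from datasets import load_dataset

dataset = load_dataset("yaful/DeepfakeTextDetect")

# Count human-written (1) vs. machine-generated (0) texts in every available split.
for split_name, split in dataset.items():
    print(split_name, Counter(split["label"]))
```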
To obtain the 6 testbeds mentioned in our paper, simply apply the provided script:
```shell
python3 deployment/prepare_testbeds.py DATA_PATH
```
Replace `DATA_PATH` with the output data directory where you want to save the 6 testbeds.
#### 2. Cloud Drive
Alternatively, you can access the 6 testbeds by downloading them directly through [Google Drive](https://drive.google.com/drive/folders/1p09vDiEvoA-ZPmpqkB2WApcwMQWiiMRl?usp=sharing)
or [Tencent Weiyun](https://share.weiyun.com/JUWQxF4H):
The folder contains 4 packages:
- testbeds_processed.zip: 6 testbeds based on the "processed" version, which can be used directly to evaluate in-distribution and out-of-distribution detection performance.
- wilder_testsets.zip: 2 wilder test sets with texts processed, aiming for (1) detecting deepfake text generated by GPT-4, and (2) detecting deepfake text in paraphrased versions.
- source.zip: Source texts of human-written texts and corresponding texts generated by LLMs, without filtering.
- processed.zip: This is a refined version of the "source" that filters out low-quality texts and specifies sources as CSV file names. For example, the "cmv_machine_specified_gpt-3.5-trubo.csv" file contains texts from the CMV domain generated by the "gpt-3.5-trubo" model using specific prompts, while "cmv_human" includes human-written CMV texts.
## 🖥️ Try Detection
### Model Access
Our Longformer detector, which has been trained on the entire dataset, is now accessible through [Huggingface](https://huggingface.co/nealcly/detection-longformer). Additionally, you can try detection directly using our [online demo](https://huggingface.co/spaces/yaful/DeepfakeTextDetect).
### Deployment
We have refined the decision boundary based on out-of-distribution settings. To ensure optimal performance, we recommend preprocessing texts before sending them to the detector.
See 🏃 [Deepfake Text Detection in the Wild](https://github.com/yafuly/DeepfakeTextDetect) for the complete detection pipeline:
```python
import torch
import os
from transformers import AutoModelForSequenceClassification,AutoTokenizer
from deployment import preprocess, detect
# init
device = 'cpu' # use 'cuda:0' if GPU is available
model_dir = "nealcly/detection-longformer"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSequenceClassification.from_pretrained(model_dir).to(device)
# preprocess
text = preprocess(text)
# detection
result = detect(text,tokenizer,model,device)
```
## 📚 Citation
If you use this dataset in your research, please cite it as follows:
```bibtex
@misc{li2023deepfake,
title={Deepfake Text Detection in the Wild},
author={Yafu Li and Qintong Li and Leyang Cui and Wei Bi and Longyue Wang and Linyi Yang and Shuming Shi and Yue Zhang},
year={2023},
eprint={2305.13242},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
We welcome contributions to improve this dataset! If you have any questions or feedback, please feel free to reach out at yafuly@gmail.com.
<!-- # 🤝 Contributing --> |
jbrinkma/pile-500k | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
- name: meta
struct:
- name: pile_set_name
dtype: string
splits:
- name: train
num_bytes: 2792809210
num_examples: 500000
download_size: 1455096364
dataset_size: 2792809210
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_existential_there | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 192564
num_examples: 863
- name: dev_mismatched
num_bytes: 173625
num_examples: 709
- name: test_matched
num_bytes: 188543
num_examples: 849
- name: test_mismatched
num_bytes: 161059
num_examples: 717
- name: train
num_bytes: 7743172
num_examples: 33927
download_size: 5200123
dataset_size: 8458963
---
# Dataset Card for "MULTI_VALUE_mnli_existential_there"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Someman/boudhastupa | ---
license: mit
---
|
fathyshalab/massive_music-de | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 18149
num_examples: 332
- name: validation
num_bytes: 3198
num_examples: 56
- name: test
num_bytes: 4440
num_examples: 81
download_size: 17641
dataset_size: 25787
---
# Dataset Card for "massive_music-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
argilla/distilabel-reasoning-prompts | ---
dataset_info:
features:
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 385228
num_examples: 3000
download_size: 176533
dataset_size: 385228
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
tags:
- synthetic
- distilabel
--- |
joey234/mmlu-conceptual_physics-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 5977
num_examples: 5
- name: test
num_bytes: 1344080
num_examples: 235
download_size: 154457
dataset_size: 1350057
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-conceptual_physics-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kalfrin/emily | ---
license: openrail
---
|
tyzhu/wikitext-103-raw-v1-sent-permute-5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3271860454
num_examples: 10808095
- name: validation
num_bytes: 1159288
num_examples: 3760
- name: test
num_bytes: 1305088
num_examples: 4358
download_size: 1896601051
dataset_size: 3274324830
---
# Dataset Card for "wikitext-103-raw-v1-sent-permute-5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
owanr/o1o2o3_xl_r2_iterater | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
sequence: string
splits:
- name: train
num_bytes: 5038417
num_examples: 7210
download_size: 2029342
dataset_size: 5038417
---
# Dataset Card for "o1o2o3_xl_r2_iterater"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
katielink/liveqa_trec2017 | ---
task_categories:
- question-answering
language:
- en
tags:
- medical
pretty_name: LiveQAMedical
size_categories:
- n<1K
---
# Dataset Card for LiveQA Medical from TREC 2017
The LiveQA'17 medical task focuses on consumer health question answering. Consumer health questions were received by the U.S. National Library of Medicine (NLM).
The dataset consists of constructed medical question-answer pairs for training and testing, with additional annotations that can be used to develop question analysis and question answering systems.
Please refer to our overview paper for more information about the constructed datasets and the LiveQA Track:
Asma Ben Abacha, Eugene Agichtein, Yuval Pinter & Dina Demner-Fushman. Overview of the Medical Question Answering Task at TREC 2017 LiveQA. TREC, Gaithersburg, MD, 2017 (https://trec.nist.gov/pubs/trec26/papers/Overview-QA.pdf).
**Homepage:** [https://github.com/abachaa/LiveQA_MedicalTask_TREC2017](https://github.com/abachaa/LiveQA_MedicalTask_TREC2017)
## Medical Training Data
The dataset provides 634 question-answer pairs for training:
1) TREC-2017-LiveQA-Medical-Train-1.xml => 388 question-answer pairs corresponding to 200 NLM questions.
Each question is divided into one or more subquestion(s). Each subquestion has one or more answer(s).
These question-answer pairs were constructed automatically and validated manually.
2) TREC-2017-LiveQA-Medical-Train-2.xml => 246 question-answer pairs corresponding to 246 NLM questions.
Answers were retrieved manually by librarians.
**You can access them as jsonl**
The datasets are not exhaustive with regard to subquestions, i.e., some subquestions might not be annotated.
Additional annotations are provided for both (i) the Focus and (ii) the Question Type used to define each subquestion.
23 question types were considered (e.g. Treatment, Cause, Diagnosis, Indication, Susceptibility, Dosage) related to four focus categories: Disease, Drug, Treatment and Exam.
## Medical Test Data
Test split can be easily downloaded via huggingface.
Test questions cover 26 question types associated with five focus categories.
Each question includes one or more subquestion(s) and at least one focus and one question type.
Reference answers were selected from trusted resources and validated by medical experts.
At least one reference answer is provided for each test question, along with its URL and relevant comments.
Question paraphrases were created by assessors and used with the reference answers to judge the participants' answers.
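Below is a minimal loading sketch. It assumes that the data files in this repository are picked up by the automatic data-file resolution of the `datasets` library; split and field names are whatever the repository exposes, so the printout is illustrative only:
```python
from datasets import load_dataset

# Load whatever configurations/splits this repository exposes and peek at one record.
dataset = load_dataset("katielink/liveqa_trec2017")
print(dataset)

first_split = next(iter(dataset.values()))
print(first_split[0])
```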
If you use these datasets, please cite the paper:
```bibtex
@inproceedings{LiveMedQA2017,
author = {Asma {Ben Abacha} and Eugene Agichtein and Yuval Pinter and Dina Demner{-}Fushman},
title = {Overview of the Medical Question Answering Task at TREC 2017 LiveQA},
booktitle = {TREC 2017},
year = {2017}
}
``` |
CyberHarem/fujimura_taiga_fatestaynightufotable | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Fujimura Taiga (Fate Stay Night [UFOTABLE])
This is the dataset of Fujimura Taiga (Fate Stay Night [UFOTABLE]), containing 73 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 73 | 56.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 73 | 45.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 139 | 85.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 73 | 56.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 102.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimura_taiga_fatestaynightufotable/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fujimura_taiga_fatestaynightufotable',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, jewelry, solo, smile, anime_coloring, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jewelry | solo | smile | anime_coloring | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------|:--------|:-----------------|:--------------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X |
|
ks21/Joe_Buck_the_GOAT | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
sequence:
sequence:
sequence: uint8
splits:
- name: train
num_bytes: 258171320
num_examples: 40
download_size: 64357844
dataset_size: 258171320
---
# Dataset Card for "Joe_Buck_the_GOAT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_MSL7__INEX16-7b | ---
pretty_name: Evaluation run of MSL7/INEX16-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MSL7/INEX16-7b](https://huggingface.co/MSL7/INEX16-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MSL7__INEX16-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T18:57:35.115210](https://huggingface.co/datasets/open-llm-leaderboard/details_MSL7__INEX16-7b/blob/main/results_2024-03-11T18-57-35.115210.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6520834204942243,\n\
\ \"acc_stderr\": 0.032045377702672045,\n \"acc_norm\": 0.6512033515678863,\n\
\ \"acc_norm_stderr\": 0.03271795035297526,\n \"mc1\": 0.6254589963280294,\n\
\ \"mc1_stderr\": 0.016943535128405306,\n \"mc2\": 0.7735060799995526,\n\
\ \"mc2_stderr\": 0.013825143011491545\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274777,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7168890659231228,\n\
\ \"acc_stderr\": 0.0044958914405194205,\n \"acc_norm\": 0.8909579764987055,\n\
\ \"acc_norm_stderr\": 0.003110549218993895\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781752,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6254589963280294,\n\
\ \"mc1_stderr\": 0.016943535128405306,\n \"mc2\": 0.7735060799995526,\n\
\ \"mc2_stderr\": 0.013825143011491545\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \
\ \"acc_stderr\": 0.012579398235589529\n }\n}\n```"
repo_url: https://huggingface.co/MSL7/INEX16-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|arc:challenge|25_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|gsm8k|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hellaswag|10_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-57-35.115210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T18-57-35.115210.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- '**/details_harness|winogrande|5_2024-03-11T18-57-35.115210.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T18-57-35.115210.parquet'
- config_name: results
data_files:
- split: 2024_03_11T18_57_35.115210
path:
- results_2024-03-11T18-57-35.115210.parquet
- split: latest
path:
- results_2024-03-11T18-57-35.115210.parquet
---
# Dataset Card for Evaluation run of MSL7/INEX16-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MSL7/INEX16-7b](https://huggingface.co/MSL7/INEX16-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MSL7__INEX16-7b",
"harness_winogrande_5",
split="train")
```
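The aggregated results can be loaded in the same way from the "results" configuration listed in the YAML header above; the config and split names below are taken from that header, so this is a sketch rather than an official snippet:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_MSL7__INEX16-7b",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated results
```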
## Latest results
These are the [latest results from run 2024-03-11T18:57:35.115210](https://huggingface.co/datasets/open-llm-leaderboard/details_MSL7__INEX16-7b/blob/main/results_2024-03-11T18-57-35.115210.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6520834204942243,
"acc_stderr": 0.032045377702672045,
"acc_norm": 0.6512033515678863,
"acc_norm_stderr": 0.03271795035297526,
"mc1": 0.6254589963280294,
"mc1_stderr": 0.016943535128405306,
"mc2": 0.7735060799995526,
"mc2_stderr": 0.013825143011491545
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274777,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.7168890659231228,
"acc_stderr": 0.0044958914405194205,
"acc_norm": 0.8909579764987055,
"acc_norm_stderr": 0.003110549218993895
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781752,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6254589963280294,
"mc1_stderr": 0.016943535128405306,
"mc2": 0.7735060799995526,
"mc2_stderr": 0.013825143011491545
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775778
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589529
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vinisebk/rachel | ---
license: openrail
---
|
Han760/traffic_flow_preds | ---
dataset_info:
features:
- name: event_time
dtype: string
- name: hour
dtype: int64
- name: temp
dtype: int64
- name: wd
dtype: int64
- name: ws
dtype: int64
- name: prec1h
dtype: int64
- name: frsn1h
dtype: int64
- name: vis
dtype: int64
- name: pred_traffic_flow
dtype: float64
splits:
- name: train
num_bytes: 88
num_examples: 1
download_size: 4803
dataset_size: 88
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ricardoxnr/jacksparrowbrricardo | ---
license: openrail
---
|
Shitba/human | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 422619.0
num_examples: 5
download_size: 423323
dataset_size: 422619.0
---
# Dataset Card for "human"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FarmerlineML/twi_dataset_2.0 | ---
license: mit
dataset_info:
features:
- name: transcription
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 867370420.839
num_examples: 2987
- name: test
num_bytes: 117918420.0
num_examples: 445
download_size: 736058272
dataset_size: 985288840.839
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
goodfellowliu/Flickr2K | ---
license: apache-2.0
---
|
zolak/twitter_dataset_1713006073 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3995274
num_examples: 9933
download_size: 1966893
dataset_size: 3995274
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-from-one-sec-cv12/chunk_172 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1072916448
num_examples: 209064
download_size: 1085478177
dataset_size: 1072916448
---
# Dataset Card for "chunk_172"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sowmya15/Profanity_27_2 | ---
license: apache-2.0
---
|
HEMASENTHIL/Task3 | ---
dataset_info:
features:
- name: English
dtype: string
- name: Thanglish
dtype: string
- name: Text
dtype: string
splits:
- name: train
num_bytes: 7144.5
num_examples: 11
- name: test
num_bytes: 1948.5
num_examples: 3
download_size: 16162
dataset_size: 9093.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_rte_analytic_superlative | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 101226
num_examples: 242
- name: train
num_bytes: 83833
num_examples: 200
download_size: 127296
dataset_size: 185059
---
# Dataset Card for "MULTI_VALUE_rte_analytic_superlative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-xsum-default-d5c7a7-1507154810 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: morenolq/bart-base-xsum
metrics: ['bertscore']
dataset_name: xsum
dataset_config: default
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: morenolq/bart-base-xsum
* Dataset: xsum
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@morenolq](https://huggingface.co/morenolq) for evaluating this model. |
xiudu/testdata | ---
license: apache-2.0
---
|
turing-motors/LLaVA-Instruct-150K-JA | ---
license: cc-by-nc-4.0
task_categories:
- visual-question-answering
- question-answering
language:
- ja
pretty_name: Japanese LLaVA Visual Instruct 150K
size_categories:
- 100K<n<1M
---
## Dataset Details
**Dataset Type:**
Japanese LLaVA Instruct 150K is a localized version of the original LLaVA Visual Instruct 150K dataset. This version is translated into Japanese using the DeepL API and is aimed at serving similar purposes in the context of the Japanese language.
**Resources for More Information:**
For information on the original dataset: [LLaVA Visual Instruct 150K](https://llava-vl.github.io/)
**License:**
Attribution-NonCommercial 4.0 International (CC BY-NC-4.0)
The dataset should abide by the policy of OpenAI: [OpenAI Terms of Use](https://openai.com/policies/terms-of-use)
**Questions or Comments:**
For questions or comments about the original model, you can go to [LLaVA GitHub Issues](https://github.com/haotian-liu/LLaVA/issues).
## Intended Use
**Primary Intended Uses:**
The primary use of this translated dataset is research on large multimodal models and chatbots in a Japanese context.
**Primary Intended Users:**
The primary intended users are researchers and hobbyists interested in computer vision, natural language processing, machine learning, and artificial intelligence, particularly those focusing on the Japanese language.
---
**Note:** This dataset is a translation of the original LLaVA Visual Instruct 150K, carried out using the DeepL API. The license remains the same as the original dataset, Attribution-NonCommercial 4.0 International (CC BY-NC-4.0).
---
|
tastypear/bluemoon-cleaned-lewd | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- not-for-all-audiences
---
This dataset extracts all conversations containing NSFW content from `grimulkan/bluemoon_Karen_cleaned`; only the first round of each conversation is retained.
Explanation of the `long_response` field:
```python
# long_response is 1 when the chosen reply is longer than the prompt
if len(chosen) > len(prompt):
    long_response = 1
``` |
richfrain/semanticSegmentationv2 | ---
license: apache-2.0
---
|
SimulBench/SimulBench-results | ---
license: mit
task_categories:
- text2text-generation
language:
- en
size_categories:
- n<1K
configs:
- config_name: all
data_files:
- split: test
path: simulbench_all.jsonl
- config_name: hard
data_files:
- split: test
path: simulbench_hard.jsonl
- config_name: objective
data_files:
- split: test
path: simulbench_objective.jsonl
- config_name: subjective
data_files:
- split: test
path: simulbench_subjective.jsonl
- config_name: system
data_files:
- split: test
path: simulbench_system.jsonl
- config_name: tool
data_files:
- split: test
path: simulbench_tool.jsonl
- config_name: role
data_files:
- split: test
path: simulbench_role.jsonl
---
|
derek-thomas/processed-bestofredditorupdates | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: date_utc
dtype: timestamp[ns]
- name: title
dtype: string
- name: flair
dtype: string
- name: content
dtype: string
- name: poster
dtype: string
- name: permalink
dtype: string
- name: id
dtype: string
- name: content_length
dtype: int64
- name: score
dtype: int64
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 122231779
num_examples: 9991
download_size: 48802673
dataset_size: 122231779
---
# Dataset Card for "processed-bestofredditorupdates"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperb/EnvironmentalSoundClassification_ESC50-InteriorAndDomesticSounds | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 88265634.5
num_examples: 200
download_size: 69351719
dataset_size: 88265634.5
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "environmental_sound_classification_interior_and_domestic_sounds_ESC50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gorovuha/ru-image-captioning | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: capt1
dtype: string
- name: capt2
dtype: string
splits:
- name: train
num_bytes: 4476497267.352
num_examples: 1548
- name: validation
num_bytes: 993690435.0
num_examples: 373
- name: test
num_bytes: 3035520555.625
num_examples: 1189
download_size: 8449713600
dataset_size: 8505708257.977
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
NobodyExistsOnTheInternet/3kmathcot | ---
license: mit
---
|
zetavg/mlqa_en_zh_tw | ---
license: cc-by-3.0
task_categories:
- question-answering
- translation
language:
- zh
- en
size_categories:
- 1K<n<10K
pretty_name: MLQA en-zh_tw
---
A bilingual Chinese-English question answering dataset based on [MLQA (MultiLingual Question Answering)](https://github.com/facebookresearch/mlqa). It is a version of the original MLQA dataset converted into Taiwanese Traditional Chinese, with the matching items of the Chinese and English versions merged for convenient use with bilingual language models. (Acknowledgements: [BYVoid/OpenCC](https://github.com/BYVoid/OpenCC), [vinta/pangu.js](https://github.com/vinta/pangu.js))
It is divided into two splits, `dev` and `test`, with 302 and 2986 items respectively.
Example:
```json
[
{
"title": {
"en": "Curling at the 2014 Winter Olympics",
"zh_tw": "2014 年冬季奧林匹克運動會冰壺比賽"
},
"paragraphs": [
{
"context": {
"en": "Qualification to the curling tournaments at the Winter Olympics was determined through two methods. Nations could qualify teams by earning qualification points from performances at the 2012 and 2013 World Curling Championships. Teams could also qualify through an Olympic qualification event which was held in the autumn of 2013. Seven nations qualified teams via World Championship qualification points, while two nations qualified through the qualification event. As host nation, Russia qualified teams automatically, thus making a total of ten teams per gender in the curling tournaments.",
"zh_tw": "本屆冬奧會冰壺比賽參加資格有兩種辦法可以取得。各國家或地區可以透過 2012 年和 2013 年的世界冰壺錦標賽,也可以透過 2013 年 12 月舉辦的一次冬奧會資格賽來取得資格。七個國家透過兩屆世錦賽積分之和來獲得資格,兩個國家則透過冬奧會資格賽。作為主辦國,俄羅斯自動獲得參賽資格,這樣就確定了冬奧會冰壺比賽的男女各十支參賽隊伍。"
},
"qas": [
{
"id": "b08184972e38a79c47d01614aa08505bb3c9b680",
"question": {
"zh_tw": "俄羅斯有多少隊獲得參賽資格?",
"en": "How many teams did Russia qualify for?"
},
"answers": {
"en": [
{
"text": "ten teams",
"answer_start": 543
}
],
"zh_tw": [
{
"text": "十支",
"answer_start": 161
}
]
}
}
]
}
]
}
]
```
For further information, see: https://github.com/facebookresearch/mlqa .
## Original Dataset
From https://github.com/facebookresearch/mlqa, the `context-zh-question-zh`, `context-zh-question-en`, and `context-en-question-zh` files of the `dev` and `test` splits were used, six files in total.
## Conversion Procedure
1. [OpenCC](https://github.com/BYVoid/OpenCC), with the `s2twp.json` configuration, is used to convert Simplified Chinese into Taiwanese Traditional Chinese with common Taiwanese vocabulary.
2. The Python version of [pangu.js](https://github.com/vinta/pangu.js) is used to add spaces between Chinese and English (full-width and half-width) characters.
3. Matching items from the Chinese and English datasets are merged.
For the detailed conversion process, see: https://github.com/zetavg/LLM-Research/blob/bba5ff7/MLQA_Dataset_Converter_(en_zh_tw).ipynb .
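A minimal sketch of the first two steps is shown below, assuming the `opencc-python-reimplemented` and `pangu` Python packages; the package choices, the config spelling, and the example sentence are assumptions for illustration and are not taken from the conversion notebook. Step 3 (merging matching items) is omitted.
```python
from opencc import OpenCC  # assumed package: opencc-python-reimplemented
import pangu               # assumed package: pangu

# Step 1: Simplified Chinese -> Taiwanese Traditional Chinese with Taiwan-specific
# vocabulary, using the s2twp configuration mentioned above.
cc = OpenCC("s2twp")

def convert(text: str) -> str:
    """Convert one text field of the original MLQA data."""
    converted = cc.convert(text)
    # Step 2: insert spaces between CJK and half-width (Latin/digit) characters.
    return pangu.spacing_text(converted)

print(convert("该软件发布在Windows和Mac OS上。"))  # hypothetical input sentence
```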
## Known Issues
* Some items may be missing one language's version of the `title`, the `paragraph` `context`, the question, or the answers.
* Some questions and answers may involve misinterpretation or ambiguity, for example the question 「俄羅斯有多少隊獲得參賽資格?」 ("How many teams did Russia qualify for?") and its answer in the 「2014 年冬季奧林匹克運動會冰壺比賽」 ("Curling at the 2014 Winter Olympics") example listed above.
* The `context` of a `paragraph` may differ greatly in length and in the scope of its content across language versions. For example, in the development split, the item whose `title` is “Adobe Photoshop”:
  * The `zh_tw` version has only two sentences: 「Adobe Photoshop,簡稱 “PS”,是一個由 Adobe 開發和發行的影象處理軟體。該軟體釋出在 Windows 和 Mac OS 上。」
  * The `en` version, by contrast, is a whole paragraph: “Adobe Photoshop is a raster graphics editor developed and published by Adobe Inc. for Windows and macOS. It was originally created in 1988 by Thomas and John Knoll. Since then, this software has become the industry standard not only in raster graphics editing, but in digital art as a whole. … (the remaining 127 characters omitted)” |