| datasetId | card |
|---|---|
diwank/scenario_instructor | ---
dataset_info:
features:
- name: query
sequence: string
- name: pos
sequence: string
- name: neg
sequence: string
splits:
- name: train
num_bytes: 14083675
num_examples: 14732
- name: test
num_bytes: 788355
num_examples: 819
- name: validation
num_bytes: 769580
num_examples: 818
download_size: 7159274
dataset_size: 15641610
---
# Dataset Card for "scenario_instructor"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kiringodhwani/msp8 | ---
dataset_info:
features:
- name: From
sequence: string
- name: Sent
sequence: string
- name: To
sequence: string
- name: Cc
sequence: string
- name: Subject
sequence: string
- name: Attachment
sequence: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 5451396
num_examples: 5348
download_size: 2135865
dataset_size: 5451396
---
# Dataset Card for "msp8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/reward-model-deberta-v3-large-v2-alpaca_farm-alpaca_gpt4_preference-preference_test | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: old_preference
dtype: int64
splits:
- name: preference
num_bytes: 113541
num_examples: 194
download_size: 76166
dataset_size: 113541
configs:
- config_name: default
data_files:
- split: preference
path: data/preference-*
---
|
CVasNLPExperiments/Sample_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_with_openai_rices
num_bytes: 4266
num_examples: 10
download_size: 5331
dataset_size: 4266
---
# Dataset Card for "Sample_test_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arcee-ai/nuclear_patents | ---
dataset_info:
features:
- name: patent_number
dtype: string
- name: section
dtype: string
- name: raw_text
dtype: string
splits:
- name: train
num_bytes: 350035355.37046283
num_examples: 33523
- name: test
num_bytes: 38895137.62953716
num_examples: 3725
download_size: 151011439
dataset_size: 388930493.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "nuclear_patents"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JetQin/seven-wonders | ---
language:
- en
tags:
- seven-wonders
size_categories:
- 100K<n<1M
--- |
katielink/GuacaMol | ---
license: mit
tags:
- chemistry
- molecular design
---
# GuacaMol: Benchmarks for Molecular Design

For an in-depth explanation of the types of benchmarks and baseline scores,
please consult the paper
[Benchmarking Models for De Novo Molecular Design](https://arxiv.org/abs/1811.09621)
## Leaderboard
See [https://www.benevolent.com/guacamol](https://www.benevolent.com/guacamol).
|
ryan2009/ph | ---
license: openrail
---
|
mesolitica/pseudostreaming-malaysian-youtube-whisper-large-v3 | ---
license: mit
task_categories:
- automatic-speech-recognition
language:
- ms
---
# Pseudostreaming Malaysian YouTube videos using Whisper Large V3
Original dataset at https://huggingface.co/datasets/mesolitica/pseudolabel-malaysian-youtube-whisper-large-v3
We used https://huggingface.co/mesolitica/conformer-medium-mixed to generate the pseudostreaming dataset; source code at https://github.com/mesolitica/malaysian-dataset/tree/master/speech-to-text-semisupervised/pseudostreaming-whisper
Total: 40,486.59 hours of audio.
Data format, from [processed.jsonl](processed.jsonl):
```json
[
  {
    "text": "dalam sukan olimpik dan paralimpik tokyo dua ribu dua puluh",
    "start": 3.52,
    "end": 6.46,
    "audio_filename": "processed-audio/1-225586-0.mp3",
    "original_audio_filename": "output-audio/3-1084-10.mp3"
  },
  {
    "text": "to azizul has",
    "start": 7.12,
    "end": 8.179999999999998,
    "audio_filename": "processed-audio/1-225586-1.mp3",
    "original_audio_filename": "output-audio/3-1084-10.mp3"
  },
  {
    "text": "awang meraih kilauan perak untuk malaysia dalam sukan olimpik tokyo dua ribu dua puluh tampil sebagai satu satunya wakil asia bagaimanapun beliau terpaksa akur di tangan pelumba great britain jason",
    "start": 8.4,
    "end": 22.98,
    "audio_filename": "processed-audio/1-225586-2.mp3",
    "original_audio_filename": "output-audio/3-1084-10.mp3"
  },
  {
    "text": "y yang meraih pingat emas",
    "start": 23.28,
    "end": 25.060000000000002,
    "audio_filename": "processed-audio/1-225586-3.mp3",
    "original_audio_filename": "output-audio/3-1084-10.mp3"
  }
]
```
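Each record maps an audio segment to its transcript, so the manifest can be parsed with the standard library alone; a minimal sketch using a sample record in the format above (the segment duration is simply `end - start`):

```python
import json

# Sample record in the processed.jsonl record format shown above;
# in practice the records are read from processed.jsonl itself.
sample = """[
  {"text": "to azizul has", "start": 7.12, "end": 8.18,
   "audio_filename": "processed-audio/1-225586-1.mp3",
   "original_audio_filename": "output-audio/3-1084-10.mp3"}
]"""

records = json.loads(sample)

# Sum per-segment durations to estimate total audio length.
total_seconds = sum(rec["end"] - rec["start"] for rec in records)
for rec in records:
    print(rec["audio_filename"], round(rec["end"] - rec["start"], 2))
```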
## how-to
```bash
git clone https://huggingface.co/datasets/mesolitica/pseudostreaming-malaya-speech-stt
cd pseudostreaming-malaya-speech-stt
wget https://www.7-zip.org/a/7z2301-linux-x64.tar.xz
tar -xf 7z2301-linux-x64.tar.xz
./7zz x processed-audio.7z.001 -y -mmt40
``` |
autoevaluate/autoeval-eval-inverse-scaling__NeQA-inverse-scaling__NeQA-1e740e-1694759585 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/NeQA
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-2.7b_eval
metrics: []
dataset_name: inverse-scaling/NeQA
dataset_config: inverse-scaling--NeQA
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-2.7b_eval
* Dataset: inverse-scaling/NeQA
* Config: inverse-scaling--NeQA
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
jjpetrisko/authentiface_v2.0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': fake
'1': real
splits:
- name: train
num_bytes: 1157058798.187
num_examples: 133567
- name: validation
num_bytes: 12237890754.551
num_examples: 19117
- name: test
num_bytes: 31235137663.783
num_examples: 38167
download_size: 8659498485
dataset_size: 44630087216.521
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
HausaNLP/Naija-Lex | ---
license: cc-by-nc-sa-4.0
tags:
- sentiment analysis, Twitter, tweets
- stopwords
multilinguality:
- monolingual
- multilingual
language:
- hau
- ibo
- yor
pretty_name: NaijaStopwords
---
# Naija-Lexicons
Naija-Lexicons is a part of the [Naija-Senti](https://huggingface.co/datasets/HausaNLP/NaijaSenti-Twitter) project. It is a list of collected stopwords from the four most widely spoken languages in Nigeria — Hausa, Igbo, Nigerian-Pidgin, and Yorùbá.
--------------------------------------------------------------------------------
## Dataset Description
- **Homepage:** https://github.com/hausanlp/NaijaSenti/tree/main/data/stopwords
- **Repository:** [GitHub](https://github.com/hausanlp/NaijaSenti/tree/main/data/stopwords)
- **Paper:** [NaijaSenti: A Nigerian Twitter Sentiment Corpus for Multilingual Sentiment Analysis](https://aclanthology.org/2022.lrec-1.63/)
- **Leaderboard:** N/A
- **Point of Contact:** [Shamsuddeen Hassan Muhammad](mailto:shamsuddeen2004@gmail.com)
### Languages
Three of the most widely spoken indigenous Nigerian languages:
* Hausa (hau)
* Igbo (ibo)
* Yoruba (yor)
## Dataset Structure
### Data Instances
Lexicon instances in each of the three languages, with their sentiment labels:
```
{
  "word": "string",
  "label": "string"
}
```
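Records in this shape are straightforward to turn into a stopword set for token filtering; a minimal sketch with hypothetical entries (the real words and labels come from the dataset itself):

```python
# Hypothetical records mirroring the {"word", "label"} shape above;
# actual entries are loaded from the dataset.
records = [
    {"word": "kuma", "label": "stopword"},
    {"word": "amma", "label": "stopword"},
]

# Build a stopword set and use it to filter a token list.
stopwords = {rec["word"] for rec in records}
tokens = ["kuma", "ruwa", "amma"]
filtered = [t for t in tokens if t not in stopwords]
print(filtered)
```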
### How to use it
```python
from datasets import load_dataset
# Load a specific language (e.g., Hausa). This downloads the manually created and translated lexicons.
ds = load_dataset("HausaNLP/Naija-Lexicons", "hau")
# You may also specify the split you want to download.
ds = load_dataset("HausaNLP/Naija-Lexicons", "hau", split="manual")
```
## Additional Information
### Dataset Curators
* Shamsuddeen Hassan Muhammad
* Idris Abdulmumin
* Ibrahim Said Ahmad
* Bello Shehu Bello
### Licensing Information
This Naija-Lexicons dataset is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License.
### Citation Information
```
@inproceedings{muhammad-etal-2022-naijasenti,
title = "{N}aija{S}enti: A {N}igerian {T}witter Sentiment Corpus for Multilingual Sentiment Analysis",
author = "Muhammad, Shamsuddeen Hassan and
Adelani, David Ifeoluwa and
Ruder, Sebastian and
Ahmad, Ibrahim Sa{'}id and
Abdulmumin, Idris and
Bello, Bello Shehu and
Choudhury, Monojit and
Emezue, Chris Chinenye and
Abdullahi, Saheed Salahudeen and
Aremu, Anuoluwapo and
Jorge, Al{\'\i}pio and
Brazdil, Pavel",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.63",
pages = "590--602",
}
```
### Contributions
> This work was carried out with support from Lacuna Fund, an initiative co-founded by The Rockefeller Foundation, Google.org, and Canada’s International Development Research Centre. The views expressed herein do not necessarily represent those of Lacuna Fund, its Steering Committee, its funders, or Meridian Institute. |
gengyuanmax/WikiTiLo | ---
license: mit
---
|
mhhmm/typescript-instruct-20k-v2c | ---
license: cc
task_categories:
- text-generation
language:
- en
tags:
- typescript
- code-generation
- instruct-tuning
---
Why always Python?

I took 20,000 TypeScript files from [The Stack](https://huggingface.co/datasets/bigcode/the-stack-smol-xl) and generated {"instruction", "output"} pairs with gpt-3.5-turbo.
Use this dataset to fine-tune a code-generation model just for TypeScript.
Make web developers great again! |
Cesar7980/fingpt_chatglm2_sentiment_instruction_lora_ft_dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 18540941.869938433
num_examples: 76772
download_size: 6417302
dataset_size: 18540941.869938433
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fingpt_chatglm2_sentiment_instruction_lora_ft_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/clara_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of clara (Pokémon)
This is the dataset of clara (Pokémon), containing 500 images and their tags.
The core tags of this character are `pink_hair, mole, mole_under_mouth, bangs, breasts, pink_lips, bow, purple_eyes, eyeshadow, hairband, eyelashes, pink_eyeshadow, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 659.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 365.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1216 | 783.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 578.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1216 | 1.10 GiB | [Download](https://huggingface.co/datasets/CyberHarem/clara_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/clara_pokemon',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, bracelet, collared_shirt, dynamax_band, makeup, mismatched_legwear, shorts, single_glove, smile, thighhighs, white_jacket, solo, one_eye_closed, partially_fingerless_gloves, hands_up, ring, fur_coat, looking_at_viewer, open_mouth |
| 1 | 10 |  |  |  |  |  | 1girl, bracelet, collared_shirt, holding_poke_ball, looking_at_viewer, shorts, single_glove, smile, thighhighs, dynamax_band, makeup, poke_ball_(basic), solo, white_jacket, mismatched_legwear, hands_up |
| 2 | 5 |  |  |  |  |  | 1girl, bracelet, collared_shirt, dynamax_band, full_body, hand_up, mismatched_legwear, shoes, shorts, single_glove, solo, standing, thighhighs, white_jacket, makeup, shaded_face, smile, white_background, blue_eyes, index_finger_raised, looking_at_viewer, ring, simple_background |
| 3 | 5 |  |  |  |  |  | 1girl, dynamax_band, looking_at_viewer, makeup, navel, nipples, single_glove, smile, solo, blue_eyes, mismatched_legwear, pussy, thighhighs, fur_coat, open_clothes, partially_fingerless_gloves, shiny_skin, anus, blush, collarbone, hair_bow, jacket, nude, open_mouth, spread_legs |
| 4 | 19 |  |  |  |  |  | 1girl, hetero, 1boy, blush, nipples, penis, open_mouth, sex, looking_at_viewer, solo_focus, sweat, vaginal, thighhighs, navel, smile, cum_in_pussy, pov, heart, mosaic_censoring, pubic_hair, spread_legs, makeup, shirt_lift, straddling, uncensored |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, hetero, ahegao, looking_back, open_mouth, patreon_username, penis, rolling_eyes, sex_from_behind, solo_focus, thighhighs, tongue_out, uncensored, blush, smile, testicles, web_address, anal, anus, asymmetrical_legwear, blue_eyes, cum, fucked_silly, hair_bow, huge_ass, nipples, nude, overflow, pussy, shiny, teeth, vaginal, watermark |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bracelet | collared_shirt | dynamax_band | makeup | mismatched_legwear | shorts | single_glove | smile | thighhighs | white_jacket | solo | one_eye_closed | partially_fingerless_gloves | hands_up | ring | fur_coat | looking_at_viewer | open_mouth | holding_poke_ball | poke_ball_(basic) | full_body | hand_up | shoes | standing | shaded_face | white_background | blue_eyes | index_finger_raised | simple_background | navel | nipples | pussy | open_clothes | shiny_skin | anus | blush | collarbone | hair_bow | jacket | nude | spread_legs | hetero | 1boy | penis | sex | solo_focus | sweat | vaginal | cum_in_pussy | pov | heart | mosaic_censoring | pubic_hair | shirt_lift | straddling | uncensored | ahegao | looking_back | patreon_username | rolling_eyes | sex_from_behind | tongue_out | testicles | web_address | anal | asymmetrical_legwear | cum | fucked_silly | huge_ass | overflow | shiny | teeth | watermark |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-----------------|:---------------|:---------|:---------------------|:---------|:---------------|:--------|:-------------|:---------------|:-------|:-----------------|:------------------------------|:-----------|:-------|:-----------|:--------------------|:-------------|:--------------------|:--------------------|:------------|:----------|:--------|:-----------|:--------------|:-------------------|:------------|:----------------------|:--------------------|:--------|:----------|:--------|:---------------|:-------------|:-------|:--------|:-------------|:-----------|:---------|:-------|:--------------|:---------|:-------|:--------|:------|:-------------|:--------|:----------|:---------------|:------|:--------|:-------------------|:-------------|:-------------|:-------------|:-------------|:---------|:---------------|:-------------------|:---------------|:------------------|:-------------|:------------|:--------------|:-------|:-----------------------|:------|:---------------|:-----------|:-----------|:--------|:--------|:------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | X | X | | X | X | X | | X | | X | | | X | X | X | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 19 |  |  |  |  |  | X | | | | X | | | | X | X | | | | | | | | X | X | | | | | | | | | | | | X | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | | | | | X | X | | | | | | | | | X | | | | | | | | | X | | | | X | X | | | X | X | | X | | X | | X | X | X | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
deeptigp/car_generation_diffusion_mini | ---
license: unknown
---
|
jamestalentium/xsum_1000_rm | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 2348532.740326889
num_examples: 1000
download_size: 830060
dataset_size: 2348532.740326889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xsum_1000_rm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_janhq__Mistral-7B-Instruct-v0.2-DARE | ---
pretty_name: Evaluation run of janhq/Mistral-7B-Instruct-v0.2-DARE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [janhq/Mistral-7B-Instruct-v0.2-DARE](https://huggingface.co/janhq/Mistral-7B-Instruct-v0.2-DARE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_janhq__Mistral-7B-Instruct-v0.2-DARE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T11:22:55.278603](https://huggingface.co/datasets/open-llm-leaderboard/details_janhq__Mistral-7B-Instruct-v0.2-DARE/blob/main/results_2023-12-12T11-22-55.278603.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5002671303432286,\n\
\ \"acc_stderr\": 0.03440023987934237,\n \"acc_norm\": 0.506255828811682,\n\
\ \"acc_norm_stderr\": 0.035162947112250174,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5435910325313378,\n\
\ \"mc2_stderr\": 0.015385871725485683\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5580204778156996,\n \"acc_stderr\": 0.014512682523128342,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349814\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5338577972515435,\n\
\ \"acc_stderr\": 0.0049783281907755245,\n \"acc_norm\": 0.7562238597888866,\n\
\ \"acc_norm_stderr\": 0.004284817238406704\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.03988903703336285,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.03988903703336285\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.5763888888888888,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315526,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315526\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.47419354838709676,\n \"acc_stderr\": 0.02840609505765332,\n \"\
acc_norm\": 0.47419354838709676,\n \"acc_norm_stderr\": 0.02840609505765332\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35960591133004927,\n \"acc_stderr\": 0.03376458246509567,\n \"\
acc_norm\": 0.35960591133004927,\n \"acc_norm_stderr\": 0.03376458246509567\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511784,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511784\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5153846153846153,\n \"acc_stderr\": 0.02533900301010651,\n \
\ \"acc_norm\": 0.5153846153846153,\n \"acc_norm_stderr\": 0.02533900301010651\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03242225027115006,\n \
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6935779816513762,\n\
\ \"acc_stderr\": 0.01976551722045852,\n \"acc_norm\": 0.6935779816513762,\n\
\ \"acc_norm_stderr\": 0.01976551722045852\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.35648148148148145,\n \"acc_stderr\": 0.03266478331527272,\n\
\ \"acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.03266478331527272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39215686274509803,\n \"acc_stderr\": 0.03426712349247271,\n \"\
acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.03426712349247271\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.48945147679324896,\n \"acc_stderr\": 0.032539983791662855,\n \
\ \"acc_norm\": 0.48945147679324896,\n \"acc_norm_stderr\": 0.032539983791662855\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.03919415545048409,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.03919415545048409\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6934865900383141,\n\
\ \"acc_stderr\": 0.016486952893041508,\n \"acc_norm\": 0.6934865900383141,\n\
\ \"acc_norm_stderr\": 0.016486952893041508\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\
\ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.015813901283913048,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.015813901283913048\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576063,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576063\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.028290869054197604,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.028290869054197604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.027701228468542595,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.027701228468542595\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35815602836879434,\n \"acc_stderr\": 0.02860208586275942,\n \
\ \"acc_norm\": 0.35815602836879434,\n \"acc_norm_stderr\": 0.02860208586275942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31747066492829207,\n\
\ \"acc_stderr\": 0.01188889206880931,\n \"acc_norm\": 0.31747066492829207,\n\
\ \"acc_norm_stderr\": 0.01188889206880931\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41911764705882354,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.41911764705882354,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49019607843137253,\n \"acc_stderr\": 0.020223946005074312,\n \
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.020223946005074312\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.032006820201639086,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.032006820201639086\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n\
\ \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.5124378109452736,\n\
\ \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418194,\n \"mc2\": 0.5435910325313378,\n\
\ \"mc2_stderr\": 0.015385871725485683\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.01217300964244914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18119787717968158,\n \
\ \"acc_stderr\": 0.010609827611527357\n }\n}\n```"
repo_url: https://huggingface.co/janhq/Mistral-7B-Instruct-v0.2-DARE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|arc:challenge|25_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|gsm8k|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hellaswag|10_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T11-22-55.278603.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T11-22-55.278603.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- '**/details_harness|winogrande|5_2023-12-12T11-22-55.278603.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T11-22-55.278603.parquet'
- config_name: results
data_files:
- split: 2023_12_12T11_22_55.278603
path:
- results_2023-12-12T11-22-55.278603.parquet
- split: latest
path:
- results_2023-12-12T11-22-55.278603.parquet
---
# Dataset Card for Evaluation run of janhq/Mistral-7B-Instruct-v0.2-DARE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [janhq/Mistral-7B-Instruct-v0.2-DARE](https://huggingface.co/janhq/Mistral-7B-Instruct-v0.2-DARE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_janhq__Mistral-7B-Instruct-v0.2-DARE",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-12T11:22:55.278603](https://huggingface.co/datasets/open-llm-leaderboard/details_janhq__Mistral-7B-Instruct-v0.2-DARE/blob/main/results_2023-12-12T11-22-55.278603.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5002671303432286,
"acc_stderr": 0.03440023987934237,
"acc_norm": 0.506255828811682,
"acc_norm_stderr": 0.035162947112250174,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5435910325313378,
"mc2_stderr": 0.015385871725485683
},
"harness|arc:challenge|25": {
"acc": 0.5580204778156996,
"acc_stderr": 0.014512682523128342,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349814
},
"harness|hellaswag|10": {
"acc": 0.5338577972515435,
"acc_stderr": 0.0049783281907755245,
"acc_norm": 0.7562238597888866,
"acc_norm_stderr": 0.004284817238406704
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.03988903703336285,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.03988903703336285
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315526,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47419354838709676,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.47419354838709676,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35960591133004927,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.35960591133004927,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511784,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511784
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5153846153846153,
"acc_stderr": 0.02533900301010651,
"acc_norm": 0.5153846153846153,
"acc_norm_stderr": 0.02533900301010651
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6935779816513762,
"acc_stderr": 0.01976551722045852,
"acc_norm": 0.6935779816513762,
"acc_norm_stderr": 0.01976551722045852
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.03266478331527272,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.03266478331527272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.03426712349247271,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.03426712349247271
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.48945147679324896,
"acc_stderr": 0.032539983791662855,
"acc_norm": 0.48945147679324896,
"acc_norm_stderr": 0.032539983791662855
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.03919415545048409,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.03919415545048409
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196687,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196687
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6934865900383141,
"acc_stderr": 0.016486952893041508,
"acc_norm": 0.6934865900383141,
"acc_norm_stderr": 0.016486952893041508
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.026897049996382875,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.026897049996382875
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.015813901283913048,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.015813901283913048
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576063,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576063
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197604,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.027701228468542595,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.027701228468542595
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35815602836879434,
"acc_stderr": 0.02860208586275942,
"acc_norm": 0.35815602836879434,
"acc_norm_stderr": 0.02860208586275942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31747066492829207,
"acc_stderr": 0.01188889206880931,
"acc_norm": 0.31747066492829207,
"acc_norm_stderr": 0.01188889206880931
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41911764705882354,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.41911764705882354,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.020223946005074312,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.020223946005074312
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.032006820201639086,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.032006820201639086
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5124378109452736,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.5124378109452736,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418194,
"mc2": 0.5435910325313378,
"mc2_stderr": 0.015385871725485683
},
"harness|winogrande|5": {
"acc": 0.749802683504341,
"acc_stderr": 0.01217300964244914
},
"harness|gsm8k|5": {
"acc": 0.18119787717968158,
"acc_stderr": 0.010609827611527357
}
}
```
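The JSON above can be post-processed directly once loaded. As an illustrative sketch (using a hand-picked subset of the scores reported above, not the full results file), the per-subject MMLU `acc_norm` values can be averaged like so:

```python
# Hand-picked subset of the per-task accuracies from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.4962962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.5986842105263158},
}

# Average acc_norm over every MMLU ("hendrycksTest") task present in the dict.
mmlu_tasks = [v for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(t["acc_norm"] for t in mmlu_tasks) / len(mmlu_tasks)
print(round(mmlu_avg, 4))  # → 0.4617 for this three-subject subset
```

The same pattern applies to the full dictionary loaded from the `results` configuration; only the subset chosen here is hypothetical.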
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
amaydle/npc-dialogue | ---
dataset_info:
features:
- name: Name
dtype: string
- name: Biography
dtype: string
- name: Query
dtype: string
- name: Response
dtype: string
- name: Emotion
dtype: string
splits:
- name: train
num_bytes: 737058.9117493472
num_examples: 1723
- name: test
num_bytes: 82133.08825065274
num_examples: 192
download_size: 201559
dataset_size: 819192.0
---
# Dataset Card for "npc-dialogue"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cjsanjay/sms-spam-collection-llama2-5k | ---
dataset_info:
features:
- name: v1
dtype: string
- name: v2
dtype: string
- name: 'Unnamed: 2'
dtype: string
- name: 'Unnamed: 3'
dtype: string
- name: 'Unnamed: 4'
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1252986
num_examples: 5572
download_size: 737014
dataset_size: 1252986
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gayanin/pubmed-abstracts-noised-with-kaggle-dist | ---
dataset_info:
- config_name: prob-01
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 18080692
num_examples: 74724
- name: test
num_bytes: 2316437
num_examples: 9341
- name: validation
num_bytes: 2380973
num_examples: 9341
download_size: 12750634
dataset_size: 22778102
- config_name: prob-02
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 17348001
num_examples: 74724
- name: test
num_bytes: 2221947
num_examples: 9341
- name: validation
num_bytes: 2284820
num_examples: 9341
download_size: 12451805
dataset_size: 21854768
- config_name: prob-03
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 16610860
num_examples: 74724
- name: test
num_bytes: 2128222
num_examples: 9341
- name: validation
num_bytes: 2185283
num_examples: 9341
download_size: 12122298
dataset_size: 20924365
- config_name: prob-04
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 15890091
num_examples: 74724
- name: test
num_bytes: 2031043
num_examples: 9341
- name: validation
num_bytes: 2091710
num_examples: 9341
download_size: 11751717
dataset_size: 20012844
- config_name: prob-05
features:
- name: refs
dtype: string
- name: trans
dtype: string
splits:
- name: train
num_bytes: 15156449
num_examples: 74724
- name: test
num_bytes: 1944482
num_examples: 9341
- name: validation
num_bytes: 1997171
num_examples: 9341
download_size: 11347983
dataset_size: 19098102
configs:
- config_name: prob-01
data_files:
- split: train
path: prob-01/train-*
- split: test
path: prob-01/test-*
- split: validation
path: prob-01/validation-*
- config_name: prob-02
data_files:
- split: train
path: prob-02/train-*
- split: test
path: prob-02/test-*
- split: validation
path: prob-02/validation-*
- config_name: prob-03
data_files:
- split: train
path: prob-03/train-*
- split: test
path: prob-03/test-*
- split: validation
path: prob-03/validation-*
- config_name: prob-04
data_files:
- split: train
path: prob-04/train-*
- split: test
path: prob-04/test-*
- split: validation
path: prob-04/validation-*
- config_name: prob-05
data_files:
- split: train
path: prob-05/train-*
- split: test
path: prob-05/test-*
- split: validation
path: prob-05/validation-*
---
|
open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2 | ---
pretty_name: Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [timpal0l/Mistral-7B-v0.1-flashback-v2](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T15:10:27.393635](https://huggingface.co/datasets/open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2/blob/main/results_2024-01-13T15-10-27.393635.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5963604853429504,\n\
\ \"acc_stderr\": 0.03312318488664919,\n \"acc_norm\": 0.6028320780728574,\n\
\ \"acc_norm_stderr\": 0.03381496123659357,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326916,\n \"mc2\": 0.40658215292594935,\n\
\ \"mc2_stderr\": 0.014101721545122618\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.014595873205358273,\n\
\ \"acc_norm\": 0.5716723549488054,\n \"acc_norm_stderr\": 0.014460496367599017\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6008763194582752,\n\
\ \"acc_stderr\": 0.004887174080003034,\n \"acc_norm\": 0.8074088826926907,\n\
\ \"acc_norm_stderr\": 0.003935286940315854\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6415094339622641,\n \"acc_stderr\": 0.029514703583981765,\n \
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.029514703583981765\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159784,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159784\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.025649381063029268,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.025649381063029268\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713549,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713549\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164542,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.038498560987940904,\n \"\
acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.038498560987940904\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705049,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705049\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.01498727064094601,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.01498727064094601\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.015382845587584524,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.015382845587584524\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963046,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963046\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4322033898305085,\n\
\ \"acc_stderr\": 0.012652297777114968,\n \"acc_norm\": 0.4322033898305085,\n\
\ \"acc_norm_stderr\": 0.012652297777114968\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.01955964680921593,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.01955964680921593\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326916,\n \"mc2\": 0.40658215292594935,\n\
\ \"mc2_stderr\": 0.014101721545122618\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663606\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2941622441243366,\n \
\ \"acc_stderr\": 0.012551285331470156\n }\n}\n```"
repo_url: https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-10-27.393635.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- '**/details_harness|winogrande|5_2024-01-13T15-10-27.393635.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T15-10-27.393635.parquet'
- config_name: results
data_files:
- split: 2024_01_13T15_10_27.393635
path:
- results_2024-01-13T15-10-27.393635.parquet
- split: latest
path:
- results_2024-01-13T15-10-27.393635.parquet
---
# Dataset Card for Evaluation run of timpal0l/Mistral-7B-v0.1-flashback-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [timpal0l/Mistral-7B-v0.1-flashback-v2](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2",
"harness_winogrande_5",
split="train")
```
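The timestamped split names follow a fixed pattern, so they can be parsed back into `datetime` objects if you need to order multiple runs. A minimal sketch (the format string is inferred from the split names in this card's YAML header, not documented by the leaderboard):

```python
from datetime import datetime

# A split name as it appears in this card's YAML header; the strptime
# format below is an assumption inferred from that naming pattern.
split_name = "2024_01_13T15_10_27.393635"
run_time = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_time.isoformat())  # 2024-01-13T15:10:27.393635
```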
## Latest results
These are the [latest results from run 2024-01-13T15:10:27.393635](https://huggingface.co/datasets/open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2/blob/main/results_2024-01-13T15-10-27.393635.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5963604853429504,
"acc_stderr": 0.03312318488664919,
"acc_norm": 0.6028320780728574,
"acc_norm_stderr": 0.03381496123659357,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326916,
"mc2": 0.40658215292594935,
"mc2_stderr": 0.014101721545122618
},
"harness|arc:challenge|25": {
"acc": 0.523037542662116,
"acc_stderr": 0.014595873205358273,
"acc_norm": 0.5716723549488054,
"acc_norm_stderr": 0.014460496367599017
},
"harness|hellaswag|10": {
"acc": 0.6008763194582752,
"acc_stderr": 0.004887174080003034,
"acc_norm": 0.8074088826926907,
"acc_norm_stderr": 0.003935286940315854
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.029514703583981765,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.029514703583981765
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159784,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029268,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029268
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713549,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713549
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164542,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.038498560987940904,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.038498560987940904
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705049,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705049
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.01498727064094601,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.01498727064094601
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584524,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963046,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963046
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4322033898305085,
"acc_stderr": 0.012652297777114968,
"acc_norm": 0.4322033898305085,
"acc_norm_stderr": 0.012652297777114968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.01955964680921593,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.01955964680921593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326916,
"mc2": 0.40658215292594935,
"mc2_stderr": 0.014101721545122618
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663606
},
"harness|gsm8k|5": {
"acc": 0.2941622441243366,
"acc_stderr": 0.012551285331470156
}
}
```
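Once loaded, a results dictionary like the one above is plain JSON, so per-task metrics can be pulled out with ordinary dict operations. A sketch over a hypothetical three-task subset of the dictionary shown:

```python
# Hypothetical subset of the results dictionary above; the full dict
# has one entry per evaluated task plus the "all" aggregate.
results = {
    "harness|arc:challenge|25": {"acc": 0.523037542662116, "acc_norm": 0.5716723549488054},
    "harness|hellaswag|10": {"acc": 0.6008763194582752, "acc_norm": 0.8074088826926907},
    "harness|winogrande|5": {"acc": 0.7719021310181531},
}

# Collect plain accuracy per task and find the strongest one.
accs = {task: metrics["acc"] for task, metrics in results.items() if "acc" in metrics}
best_task = max(accs, key=accs.get)
print(best_task)  # harness|winogrande|5
```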
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DanielSongShen/CLIP-food101-image-dataset-med_latents_hidden_states | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
- name: CLIP_image_latent
sequence:
sequence: float32
- name: CLIP_hidden_states
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1360302794.0
num_examples: 1000
download_size: 1369715072
dataset_size: 1360302794.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlbHugUser/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4201526
num_examples: 1000
download_size: 2247084
dataset_size: 4201526
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
um-ids/diamond-kg | ---
license: cc0-1.0
---
|
ramnika003/autotrain-data-sentiment_analysis_project | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: sentiment_analysis_project
## Dataset Description
This dataset has been automatically processed by AutoTrain for the project sentiment_analysis_project.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Realizing that I don`t have school today... or tomorrow... or for the next few months. I really nee[...]",
"target": 1
},
{
"text": "Good morning tweeps. Busy this a.m. but not in a working way",
"target": 2
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(num_classes=3, names=['negative', 'neutral', 'positive'], id=None)"
}
```
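Putting the sample and the field definitions together: `target` is a `ClassLabel`, so each integer value indexes into the class-name list. A minimal sketch using the two samples shown above:

```python
# Class names from the ClassLabel definition above.
names = ["negative", "neutral", "positive"]

# The two sample rows shown earlier (text truncated).
samples = [
    {"text": "Realizing that I don`t have school today...", "target": 1},
    {"text": "Good morning tweeps. Busy this a.m. but not in a working way", "target": 2},
]

# Map each integer target onto its class name.
labels = [names[s["target"]] for s in samples]
print(labels)  # ['neutral', 'positive']
```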
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 16180 |
| valid | 4047 |
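The counts in the table above correspond to roughly an 80/20 train/validation split, which can be checked directly:

```python
# Split sizes from the table above.
train_n, valid_n = 16180, 4047
total = train_n + valid_n
train_frac = train_n / total
print(f"{train_frac:.2%}")  # 79.99%
```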
|
liuyanchen1015/MULTI_VALUE_qqp_drop_aux_be_progressive | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 270824
num_examples: 1553
- name: test
num_bytes: 2461834
num_examples: 14195
- name: train
num_bytes: 2398053
num_examples: 13719
download_size: 3124623
dataset_size: 5130711
---
# Dataset Card for "MULTI_VALUE_qqp_drop_aux_be_progressive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ArkLade/housemix1 | ---
license: openrail
task_categories:
- zero-shot-classification
pretty_name: tiny_demo
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
heliosprime/twitter_dataset_1713211939 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25710
num_examples: 69
download_size: 21621
dataset_size: 25710
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713211939"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__opt125m_10e6_run1 | ---
pretty_name: Evaluation run of BFauber/opt125m_10e6_run1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt125m_10e6_run1](https://huggingface.co/BFauber/opt125m_10e6_run1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e6_run1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T18:19:34.951673](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e6_run1/blob/main/results_2024-02-02T18-19-34.951673.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2453956177453566,\n\
\ \"acc_stderr\": 0.03035774790592599,\n \"acc_norm\": 0.24574841257866145,\n\
\ \"acc_norm_stderr\": 0.031160600953299776,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.48593837171548643,\n\
\ \"mc2_stderr\": 0.01578462194827542\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406455,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453956\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27693686516630156,\n\
\ \"acc_stderr\": 0.00446570481089354,\n \"acc_norm\": 0.29794861581358295,\n\
\ \"acc_norm_stderr\": 0.004564220870531578\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n\
\ \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.034240846698915216,\n\
\ \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.034240846698915216\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n\
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332215,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332215\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148547,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21834862385321102,\n\
\ \"acc_stderr\": 0.017712600528722734,\n \"acc_norm\": 0.21834862385321102,\n\
\ \"acc_norm_stderr\": 0.017712600528722734\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824685,\n\
\ \"acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824685\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n\
\ \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n\
\ \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n\
\ \"acc_stderr\": 0.015866243073215054,\n \"acc_norm\": 0.26947637292464877,\n\
\ \"acc_norm_stderr\": 0.015866243073215054\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.025494259350694888,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.025494259350694888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.02301670564026219,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.02301670564026219\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307713,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307713\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n\
\ \"acc_stderr\": 0.011111715336101143,\n \"acc_norm\": 0.25358539765319427,\n\
\ \"acc_norm_stderr\": 0.011111715336101143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.15441176470588236,\n \"acc_stderr\": 0.021950024722922026,\n\
\ \"acc_norm\": 0.15441176470588236,\n \"acc_norm_stderr\": 0.021950024722922026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.036942843353378,\n\
\ \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.036942843353378\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n\
\ \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n\
\ \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.2835820895522388,\n \"acc_stderr\": 0.031871875379197966,\n\
\ \"acc_norm\": 0.2835820895522388,\n \"acc_norm_stderr\": 0.031871875379197966\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.1927710843373494,\n \"acc_stderr\": 0.03070982405056527,\n\
\ \"acc_norm\": 0.1927710843373494,\n \"acc_norm_stderr\": 0.03070982405056527\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n\
\ \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n\
\ \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n\
\ \"mc2\": 0.48593837171548643,\n \"mc2_stderr\": 0.01578462194827542\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n\
\ \"acc_stderr\": 0.014039239216484627\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt125m_10e6_run1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T18-19-34.951673.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- '**/details_harness|winogrande|5_2024-02-02T18-19-34.951673.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T18-19-34.951673.parquet'
- config_name: results
data_files:
- split: 2024_02_02T18_19_34.951673
path:
- results_2024-02-02T18-19-34.951673.parquet
- split: latest
path:
- results_2024-02-02T18-19-34.951673.parquet
---
# Dataset Card for Evaluation run of BFauber/opt125m_10e6_run1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e6_run1](https://huggingface.co/BFauber/opt125m_10e6_run1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e6_run1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-02T18:19:34.951673](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e6_run1/blob/main/results_2024-02-02T18-19-34.951673.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2453956177453566,
"acc_stderr": 0.03035774790592599,
"acc_norm": 0.24574841257866145,
"acc_norm_stderr": 0.031160600953299776,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.48593837171548643,
"mc2_stderr": 0.01578462194827542
},
"harness|arc:challenge|25": {
"acc": 0.2090443686006826,
"acc_stderr": 0.011882746987406455,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453956
},
"harness|hellaswag|10": {
"acc": 0.27693686516630156,
"acc_stderr": 0.00446570481089354,
"acc_norm": 0.29794861581358295,
"acc_norm_stderr": 0.004564220870531578
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.13725490196078433,
"acc_stderr": 0.034240846698915216,
"acc_norm": 0.13725490196078433,
"acc_norm_stderr": 0.034240846698915216
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332215,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332215
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148547,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722734,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824685,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.015866243073215054,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.015866243073215054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694888,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307713,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307713
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.011111715336101143,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.011111715336101143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.15441176470588236,
"acc_stderr": 0.021950024722922026,
"acc_norm": 0.15441176470588236,
"acc_norm_stderr": 0.021950024722922026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.03070982405056527,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.03070982405056527
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.48593837171548643,
"mc2_stderr": 0.01578462194827542
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.014039239216484627
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
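To work with these numbers programmatically, the JSON above can be loaded and flattened — a minimal sketch over a hand-copied excerpt (the task names and accuracy values below are taken verbatim from the results above; the full file linked above has the same shape):

```python
import json

# Excerpt of the results JSON shown above (truncated for brevity).
raw = """
{
  "all": {"acc": 0.2453956177453566},
  "harness|arc:challenge|25": {"acc": 0.2090443686006826},
  "harness|hellaswag|10": {"acc": 0.27693686516630156},
  "harness|winogrande|5": {"acc": 0.5217048145224941}
}
"""
results = json.loads(raw)

# Per-task accuracies, best first (skipping the "all" aggregate).
per_task = sorted(
    ((task, scores["acc"]) for task, scores in results.items() if task != "all"),
    key=lambda item: item[1],
    reverse=True,
)
print(per_task[0])  # highest-accuracy task in the excerpt
```

The same loop works on the full results file, since every per-task entry nests its metrics under the task name.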
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nateraw/snares | ---
language: en
license: other
---
# Snares
FSD50K subset of just snares.
```
wget -nc https://huggingface.co/datasets/nateraw/snares/resolve/main/snares.csv
wget -nc https://huggingface.co/datasets/nateraw/snares/resolve/main/snares.zip
unzip snares.zip
```
If you unpack as described above, `snares.csv` will contain the correct filepath to each audio file when loaded as a CSV. Here we show this with pandas:
```python
import pandas as pd
df = pd.read_csv('snares.csv')
``` |
cafbr/sample-hf-github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 7677601
num_examples: 1000
download_size: 2120805
dataset_size: 7677601
---
# Dataset Card for "sample-hf-github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dotan1111/MSA-nuc-6-seq | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics and a plethora of methods were devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar and often better than commonly used methods, such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs with ten sequences each, which were used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I,R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths as well as the sequence lengths of the tree leaves vary within and among datasets, as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated with the WAG+G model, and all DNA datasets were generated with the GTR+G model, with the following parameters: (1) frequencies for the different nucleotides *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A" and "G"; (2) the substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" for the substitution matrix.
## Example:
The following example corresponds to the MSA illustrated in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
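In the example, the "MSA" string interleaves the alignment columns: consecutive chunks of *n* characters (here *n* = 2 sequences) are the columns of the alignment, so "AAAC-GGG" encodes the rows "AA-G" and "ACGG". A minimal decoder under that interpretation (inferred from this example, not the authors' reference implementation):

```
def decode_alignment(msa: str, num_seqs: int) -> list[str]:
    """Recover aligned rows from the column-major interleaved MSA
    string: each chunk of `num_seqs` characters is one column."""
    assert len(msa) % num_seqs == 0
    columns = [msa[i:i + num_seqs] for i in range(0, len(msa), num_seqs)]
    return ["".join(col[k] for col in columns) for k in range(num_seqs)]

rows = decode_alignment("AAAC-GGG", 2)
# rows == ["AA-G", "ACGG"]; stripping "-" recovers the unaligned inputs
```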
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko, T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@inproceedings{Dotan_multiple_2023,
	author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
	booktitle = {The Eleventh International Conference on Learning Representations (ICLR 2023)},
	month = aug,
	title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
	year = {2023}
}
``` |
johannes-garstenauer/ENN_class_embeddings_dim_1 | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1076352
num_examples: 67272
download_size: 400578
dataset_size: 1076352
---
# Dataset Card for "ENN_class_embeddings_dim_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
geeknaren/audio-dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 468529.0
num_examples: 1
- name: validation
num_bytes: 468529.0
num_examples: 1
download_size: 939388
dataset_size: 937058.0
---
# Dataset Card for "audio-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GustavoMilena/Sa-en | ---
license: mit
---
|
distilled-from-one-sec-cv12/chunk_187 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 789799404
num_examples: 153897
download_size: 804462546
dataset_size: 789799404
---
# Dataset Card for "chunk_187"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mkshing/assets | ---
license: mit
---
|
whizystems/synthdog-hu | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 3976426.0
num_examples: 41
- name: validation
num_bytes: 481072.0
num_examples: 4
- name: test
num_bytes: 436810.0
num_examples: 5
download_size: 4832948
dataset_size: 4894308.0
---
|
ouob/hakkadict_210430_sixian | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: text_thrs
dtype: string
- name: path
dtype:
audio:
sampling_rate: 22000
splits:
- name: train
num_bytes: 340801105.128
num_examples: 15263
download_size: 335215646
dataset_size: 340801105.128
---
# Dataset Card for "hakkadict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sparse-generative-ai/results | ---
license: apache-2.0
---
|
autoevaluate/autoeval-staging-eval-project-multi_news-416d7689-12805701 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_news
eval_info:
task: summarization
model: datien228/distilbart-cnn-12-6-ftn-multi_news
metrics: []
dataset_name: multi_news
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: datien228/distilbart-cnn-12-6-ftn-multi_news
* Dataset: multi_news
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ccdv](https://huggingface.co/ccdv) for evaluating this model. |
anan-2024/twitter_dataset_1713171309 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26374
num_examples: 62
download_size: 13731
dataset_size: 26374
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MegPaulson/ISIC_Melanoma | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_seg
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 55448028.0
num_examples: 438
download_size: 54990564
dataset_size: 55448028.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ISIC_Melanoma"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp | ---
pretty_name: Evaluation run of tourist800/Marcoro14-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tourist800/Marcoro14-7B-slerp](https://huggingface.co/tourist800/Marcoro14-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T18:32:24.206889](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp/blob/main/results_2024-01-28T18-32-24.206889.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6111388777526852,\n\
\ \"acc_stderr\": 0.03287383799644916,\n \"acc_norm\": 0.6159662135212005,\n\
\ \"acc_norm_stderr\": 0.03353760847602086,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5207873423930568,\n\
\ \"mc2_stderr\": 0.0153889376471881\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5861774744027304,\n \"acc_stderr\": 0.014392730009221004,\n\
\ \"acc_norm\": 0.6339590443686007,\n \"acc_norm_stderr\": 0.014077223108470137\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6421031666998606,\n\
\ \"acc_stderr\": 0.004784018497679814,\n \"acc_norm\": 0.8376817367058355,\n\
\ \"acc_norm_stderr\": 0.0036798891253998155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936337,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165897,\n \"\
acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165897\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.025069094387296532,\n\
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.025069094387296532\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.014214138556913915,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071762,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n\
\ \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n\
\ \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4406779661016949,\n\
\ \"acc_stderr\": 0.012680037994097074,\n \"acc_norm\": 0.4406779661016949,\n\
\ \"acc_norm_stderr\": 0.012680037994097074\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505518,\n \
\ \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505518\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n\
\ \"acc_stderr\": 0.034611994290400135,\n \"acc_norm\": 0.6019900497512438,\n\
\ \"acc_norm_stderr\": 0.034611994290400135\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.5207873423930568,\n\
\ \"mc2_stderr\": 0.0153889376471881\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \
\ \"acc_stderr\": 0.01350435778749403\n }\n}\n```"
repo_url: https://huggingface.co/tourist800/Marcoro14-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|arc:challenge|25_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|gsm8k|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hellaswag|10_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T18-32-24.206889.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- '**/details_harness|winogrande|5_2024-01-28T18-32-24.206889.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T18-32-24.206889.parquet'
- config_name: results
data_files:
- split: 2024_01_28T18_32_24.206889
path:
- results_2024-01-28T18-32-24.206889.parquet
- split: latest
path:
- results_2024-01-28T18-32-24.206889.parquet
---
# Dataset Card for Evaluation run of tourist800/Marcoro14-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tourist800/Marcoro14-7B-slerp](https://huggingface.co/tourist800/Marcoro14-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-28T18:32:24.206889](https://huggingface.co/datasets/open-llm-leaderboard/details_tourist800__Marcoro14-7B-slerp/blob/main/results_2024-01-28T18-32-24.206889.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6111388777526852,
"acc_stderr": 0.03287383799644916,
"acc_norm": 0.6159662135212005,
"acc_norm_stderr": 0.03353760847602086,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5207873423930568,
"mc2_stderr": 0.0153889376471881
},
"harness|arc:challenge|25": {
"acc": 0.5861774744027304,
"acc_stderr": 0.014392730009221004,
"acc_norm": 0.6339590443686007,
"acc_norm_stderr": 0.014077223108470137
},
"harness|hellaswag|10": {
"acc": 0.6421031666998606,
"acc_stderr": 0.004784018497679814,
"acc_norm": 0.8376817367058355,
"acc_norm_stderr": 0.0036798891253998155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165897,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165897
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.025069094387296532,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.025069094387296532
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217902,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217902
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913915,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913915
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071762,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4406779661016949,
"acc_stderr": 0.012680037994097074,
"acc_norm": 0.4406779661016949,
"acc_norm_stderr": 0.012680037994097074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.019450768432505518,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.019450768432505518
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.034611994290400135,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.034611994290400135
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.5207873423930568,
"mc2_stderr": 0.0153889376471881
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.01350435778749403
}
}
```
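The per-task metrics above can be aggregated in the usual way by macro-averaging the `"acc"` values across tasks. The sketch below is not part of the evaluation harness; it only illustrates the computation on a small hypothetical excerpt of the results dict (three tasks picked from the JSON above), skipping the pre-aggregated `"all"` entry.

```python
# Hypothetical excerpt of a results dict like the one shown above;
# in practice this would come from the full results JSON file.
results = {
    "harness|arc:challenge|25": {"acc": 0.5861774744027304},
    "harness|hellaswag|10": {"acc": 0.6421031666998606},
    "harness|winogrande|5": {"acc": 0.7790055248618785},
}

# Collect every per-task accuracy, skipping aggregate keys such as "all".
accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
macro_avg = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {macro_avg:.4f}")
```

Note that the leaderboard's headline numbers are computed by its own pipeline; this snippet only shows the general shape of such an aggregation.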
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Bingsu/national_library_of_korea_book_info | ---
language:
- ko
license:
- other
multilinguality:
- monolingual
pretty_name: national_library_of_korea_book_info
size_categories:
- 1M<n<10M
---
# national_library_of_korea_book_info
## Dataset Description
- **Homepage** [문화 빅데이터 플랫폼](https://www.culture.go.kr/bigdata/user/data_market/detail.do?id=63513d7b-9b87-4ec1-a398-0a18ecc45411)
- **Download Size** 759 MB
- **Generated Size** 2.33 GB
- **Total Size** 3.09 GB
Data on the books held by the National Library of Korea, released by the National Library of Korea.
### License
other ([KOGL](https://www.kogl.or.kr/info/license.do#05-tab) (Korea Open Government License) Type-1)

- Under the KOGL above, users can use public works freely and without fee, regardless of commercial use, and can change or modify them to create derivative works, provided they comply with the following terms:
<details>
<summary>KOGL Type 1</summary>
1. Source Indication Liability
- Users who use public works shall indicate source or copyright as follows:
- EX : “000(public institution's name)'s public work is used according to KOGL”
- The link shall be provided when online hyperlink for the source website is available.
- Marking shall not be used to misguide the third party that the user is sponsored by public institution or user has a special relationship with public institutions.
2. Use Prohibited Information
- Personal information that is protected by Personal Information Protection Act, Promotion for Information Network Use and Information Protection Act, etc.
- Credit information protected by the Use and Protection of Credit Information Act, etc.
- Military secrets protected by Military Secret Protection Act, etc.
- Information that is the object of other rights such as trademark right, design right, design right or patent right, etc., or that is owned by third party's copyright.
- Other information that is use prohibited information according to other laws.
3. Public Institution's Liability Exemption
- Public institution does not guarantee the accuracy or continued service of public works.
- Public institution and its employees do not have any liability for any kind of damage or disadvantage that may arise by using public works.
4. Effect of Use Term Violation
- The use permission is automatically terminated when user violates any of the KOGL's Use Terms, and the user shall immediately stop using public works.
</details>
## Data Structure
### Data Instance
```python
>>> from datasets import load_dataset
>>>
>>> ds = load_dataset("Bingsu/national_library_of_korea_book_info", split="train")
>>> ds
Dataset({
features: ['isbn13', 'vol', 'title', 'author', 'publisher', 'price', 'img_url', 'description'],
num_rows: 7919278
})
```
```python
>>> ds.features
{'isbn13': Value(dtype='string', id=None),
'vol': Value(dtype='string', id=None),
'title': Value(dtype='string', id=None),
'author': Value(dtype='string', id=None),
'publisher': Value(dtype='string', id=None),
'price': Value(dtype='string', id=None),
'img_url': Value(dtype='string', id=None),
'description': Value(dtype='string', id=None)}
```
or
```python
>>> import pandas as pd
>>>
>>> url = "https://huggingface.co/datasets/Bingsu/national_library_of_korea_book_info/resolve/main/train.csv.gz"
>>> df = pd.read_csv(url, low_memory=False)
```
```python
>>> df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 7919278 entries, 0 to 7919277
Data columns (total 8 columns):
# Column Dtype
--- ------ -----
0 isbn13 object
1 vol object
2 title object
3 author object
4 publisher object
5 price object
6 img_url object
7 description object
dtypes: object(8)
memory usage: 483.4+ MB
```
### Null data
```python
>>> df.isnull().sum()
isbn13 3277
vol 5933882
title 19662
author 122998
publisher 1007553
price 3096535
img_url 3182882
description 4496194
dtype: int64
```
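Given the null counts above, a common first preprocessing step is to drop rows that are unusable for a given task. A minimal sketch with pandas (the chosen column subset is an assumption; adjust it to your use case):

```python
import pandas as pd

# Toy frame mimicking the dataset schema (illustrative values, not real records).
df = pd.DataFrame({
    "isbn13": ["9788900000000", None, "9788911111111"],
    "title": ["Book A", "Book B", None],
    "description": ["desc", None, "desc"],
})

# Keep only rows that have both an ISBN and a description.
usable = df.dropna(subset=["isbn13", "description"])
print(len(usable))  # 2
```

The same call works on the full 7.9M-row frame; only the `subset` columns are checked for missing values.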
### Note
```python
>>> df[df["description"].str.contains("[해외주문원서]", regex=False) == True].head()["description"]
10773 [해외주문원서] 고객님의 요청으로 수입 주문하는 도서이므로, 주문취소 및 반품이 불...
95542 [해외주문원서] 고객님의 요청으로 수입 주문하는 도서이므로, 주문취소 및 반품이 불...
95543 [해외주문원서] 고객님의 요청으로 수입 주문하는 도서이므로, 주문취소 및 반품이 불...
96606 [해외주문원서] 고객님의 요청으로 수입 주문하는 도서이므로, 주문취소 및 반품이 불...
96678 [해외주문원서] 고객님의 요청으로 수입 주문하는 도서이므로, 주문취소 및 반품이 불...
Name: description, dtype: object
```
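Since these `[해외주문원서]` descriptions are boilerplate overseas-order notices rather than real book descriptions, one option is to treat them as missing. A sketch with toy rows (the sample strings are assumptions shaped like the output above):

```python
import pandas as pd

df = pd.DataFrame({
    "title": ["Foreign order book", "Normal book"],
    "description": [
        "[해외주문원서] 고객님의 요청으로 수입 주문하는 도서이므로 ...",
        "A real description of the book.",
    ],
})

# Treat the boilerplate overseas-order notice as a missing description.
mask = df["description"].str.contains("[해외주문원서]", regex=False)
df.loc[mask == True, "description"] = None
print(df["description"].isna().sum())  # 1
```

The `== True` comparison mirrors the snippet above: it keeps rows with a NaN description (where `str.contains` returns NaN) out of the mask.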
|
JovialValley/phoneme_totaldataset_0 | ---
dataset_info:
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype: string
- name: emotion
dtype: string
- name: emotion_str
dtype: string
splits:
- name: train
num_bytes: 163223522.0
num_examples: 389
- name: test
num_bytes: 41231058.0
num_examples: 98
download_size: 138510939
dataset_size: 204454580.0
---
# Dataset Card for "phoneme_totaldataset_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
killah-t-cell/multinose_test_controlnet_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 411804.0
num_examples: 9
download_size: 0
dataset_size: 411804.0
---
# Dataset Card for "multinose_test_controlnet_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kla-20/sai-literature-doc-embeddings | ---
license: apache-2.0
---
|
sunhaha123/ref | ---
license: apache-2.0
---
|
leego/dataset_ml_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 393969
num_examples: 1470
download_size: 172473
dataset_size: 393969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
heliosprime/twitter_dataset_1713059012 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11263
num_examples: 25
download_size: 9943
dataset_size: 11263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713059012"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DataStudio/OCR_UppercaseARIA | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 499062262.625
num_examples: 12323
download_size: 498987119
dataset_size: 499062262.625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cahya/instructions-ta | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 2612517.3091275846
num_examples: 1784
- name: test
num_bytes: 146441.55320221887
num_examples: 100
- name: validation
num_bytes: 144977.13767019668
num_examples: 99
download_size: 1024957
dataset_size: 2903936.0
---
# Dataset Card for "instructions-ta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cyanelis/69189881T | ---
license: cc-by-nc-4.0
--- |
SINAI/CRISOL | ---
license: cc-by-nc-sa-4.0
language:
- es
tags:
- Opinion Analysis
pretty_name: CRISOL
---
# CRISOL - Knowledge base of opinions for Spanish
## Description:
CRiSOL combines two linguistic resources for Opinion Analysis: the Spanish opinion word list iSOL and the English opinion lexicon SentiWordNet. The result is a filtering of SentiWordNet by the iSOL terms, yielding a resource in which each word carries two sources of information that can be leveraged together or separately.
CRiSOL contains the 8,135 iSOL entries, of which 4,434 also have an associated SentiWordNet polarity value.
## Citation
```
@article{PLN5226,
author = {M. Dolores Molina González y Eugenio Martínez Cámara y M. Teresa Martín Valdivia},
title = {CRiSOL: Base de Conocimiento de Opiniones para el Español},
journal = {Procesamiento del Lenguaje Natural},
volume = {55},
number = {0},
year = {2015},
issn = {1989-7553},
url = {http://journal.sepln.org/sepln/ojs/ojs/index.php/pln/article/view/5226},
pages = {143--150}
}
``` |
alaa2111/new_one | ---
license: openrail
---
|
yeonsun/JGLUE-custom | ---
configs:
- config_name: JCoLA
data_files:
- split: train
path: "JCoLA/train.parquet"
- split: validation
path: "JCoLA/validation.parquet"
- config_name: JCommonsenseQA
data_files:
- split: train
path: "JCommonsenseQA/train.parquet"
- split: validation
path: "JCommonsenseQA/validation.parquet"
- config_name: JNLI
data_files:
- split: train
path: "JNLI/train.parquet"
- split: validation
path: "JNLI/validation.parquet"
- config_name: JSQuAD
data_files:
- split: train
path: "JSQuAD/train.parquet"
- split: validation
path: "JSQuAD/validation.parquet"
- config_name: JSTS
data_files:
- split: train
path: "JSTS/train.parquet"
- split: validation
path: "JSTS/validation.parquet"
---
# TEST for creating subsets of Datasets |
Ecstra/factum | ---
license: openrail
task_categories:
- text-classification
language:
- en
pretty_name: FACTUM
size_categories:
- 10M<n<100M
---
A basic SQL database that classifies claims as true (1) or false (0).
Built by cleaning the FEVER and FEVEROUS datasets and by scraping and cleaning the PolitiFact website, all combined into this DB file. |
BitTranslate/chatgpt-prompts-Persian | ---
license: cc0-1.0
---
|
distilled-from-one-sec-cv12/chunk_267 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1015083940
num_examples: 197795
download_size: 1035682283
dataset_size: 1015083940
---
# Dataset Card for "chunk_267"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xPXXX/stackoverflow_DL-related_questions | ---
license: mit
---
|
sbmaruf/forai_ml_masakhane_mafand | ---
annotations_creators:
- expert-generated
language:
- en
- fr
- am
- bm
- bbj
- ee
- fon
- ha
- ig
- lg
- mos
- ny
- pcm
- rw
- sn
- sw
- tn
- tw
- wo
- xh
- yo
- zu
language_creators:
- expert-generated
license:
- cc-by-nc-4.0
multilinguality:
- translation
- multilingual
pretty_name: mafand
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- news
- mafand
- masakhane
task_categories:
- translation
task_ids: []
---
An unofficial version of https://huggingface.co/datasets/masakhane/mafand
We created a different data loader for a @forai_ml project. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_221 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1157834236.0
num_examples: 227383
download_size: 1183247786
dataset_size: 1157834236.0
---
# Dataset Card for "chunk_221"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_223 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1248251228
num_examples: 243229
download_size: 1275640729
dataset_size: 1248251228
---
# Dataset Card for "chunk_223"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
0x7o/fialka-v2-dpo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 6976884
num_examples: 2470
download_size: 2419126
dataset_size: 6976884
---
# Dataset Card for "fialka-v2-dpo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-RM-Mistral-7B-re-preference-256-nsample-2 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
splits:
- name: preference
num_bytes: 58969064
num_examples: 20001
download_size: 26765193
dataset_size: 58969064
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: preference
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
---
|
mnoukhov/openai_summarize_vllm_generated_20k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 35534924
num_examples: 19940
download_size: 21640884
dataset_size: 35534924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
StevenLe456/head-pose | ---
dataset_info:
features:
- name: x
dtype: image
- name: y
sequence: int64
splits:
- name: train
num_bytes: 1286594599.25
num_examples: 13950
download_size: 1286514507
dataset_size: 1286594599.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_synthetic_superlative | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 77
num_examples: 1
- name: test
num_bytes: 62
num_examples: 1
- name: train
num_bytes: 871
num_examples: 11
download_size: 6263
dataset_size: 1010
---
# Dataset Card for "MULTI_VALUE_cola_synthetic_superlative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/CIFAR10_test_google_flan_t5_xl_mode_A_ns_10000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 3816888
num_examples: 10000
download_size: 1081972
dataset_size: 3816888
---
# Dataset Card for "CIFAR10_test_google_flan_t5_xl_mode_A_ns_10000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rjaiswal/van_cleef | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 743980337.0
num_examples: 165
download_size: 735324133
dataset_size: 743980337.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
recipe_nlg | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text2text-generation
- text-generation
- fill-mask
- text-retrieval
- summarization
task_ids:
- document-retrieval
- entity-linking-retrieval
- explanation-generation
- language-modeling
- masked-language-modeling
paperswithcode_id: recipenlg
pretty_name: RecipeNLG
dataset_info:
features:
- name: id
dtype: int32
- name: title
dtype: string
- name: ingredients
sequence: string
- name: directions
sequence: string
- name: link
dtype: string
- name: source
dtype:
class_label:
names:
'0': Gathered
'1': Recipes1M
- name: ner
sequence: string
splits:
- name: train
num_bytes: 2194783815
num_examples: 2231142
download_size: 0
dataset_size: 2194783815
---
# Dataset Card for RecipeNLG
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://recipenlg.cs.put.poznan.pl/
- **Repository:** https://github.com/Glorf/recipenlg
- **Paper:** https://www.aclweb.org/anthology/volumes/2020.inlg-1/
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
RecipeNLG: A Cooking Recipes Dataset for Semi-Structured Text Generation.
While the RecipeNLG dataset is based on the Recipe1M+ dataset, it greatly expands the number of recipes available.
The new dataset provides over 1 million new, preprocessed and deduplicated recipes on top of the Recipe1M+ dataset.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is in English.
## Dataset Structure
### Data Instances
```
{'id': 0,
'title': 'No-Bake Nut Cookies',
'ingredients': ['1 c. firmly packed brown sugar',
'1/2 c. evaporated milk',
'1/2 tsp. vanilla',
'1/2 c. broken nuts (pecans)',
'2 Tbsp. butter or margarine',
'3 1/2 c. bite size shredded rice biscuits'],
'directions': ['In a heavy 2-quart saucepan, mix brown sugar, nuts, evaporated milk and butter or margarine.',
'Stir over medium heat until mixture bubbles all over top.',
'Boil and stir 5 minutes more. Take off heat.',
'Stir in vanilla and cereal; mix well.',
'Using 2 teaspoons, drop and shape into 30 clusters on wax paper.',
'Let stand until firm, about 30 minutes.'],
'link': 'www.cookbooks.com/Recipe-Details.aspx?id=44874',
'source': 0,
'ner': ['brown sugar',
'milk',
'vanilla',
'nuts',
'butter',
'bite size shredded rice biscuits']}
```
### Data Fields
- `id` (`int`): ID.
- `title` (`str`): Title of the recipe.
- `ingredients` (`list` of `str`): Ingredients.
- `directions` (`list` of `str`): Instruction steps.
- `link` (`str`): URL link.
- `source` (`ClassLabel`): Origin of each recipe record, with possible value {"Gathered", "Recipes1M"}:
- "Gathered" (0): Additional recipes gathered from multiple cooking web pages, using automated scripts in a web scraping process.
- "Recipes1M" (1): Recipes from "Recipe1M+" dataset.
- `ner` (`list` of `str`): NER food entities.
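Because `source` is a `ClassLabel`, raw integer values can be mapped back to their names using the label list from the YAML header. A small sketch using (an abbreviated copy of) the sample record above:

```python
# Abbreviated copy of the record from the "Data Instances" section.
record = {
    "title": "No-Bake Nut Cookies",
    "ingredients": [
        "1 c. firmly packed brown sugar",
        "1/2 c. evaporated milk",
        "1/2 tsp. vanilla",
        "1/2 c. broken nuts (pecans)",
        "2 Tbsp. butter or margarine",
        "3 1/2 c. bite size shredded rice biscuits",
    ],
    "source": 0,
    "ner": ["brown sugar", "milk", "vanilla", "nuts", "butter",
            "bite size shredded rice biscuits"],
}

SOURCE_NAMES = ["Gathered", "Recipes1M"]  # int-to-name mapping of the ClassLabel

# Each NER food entity should occur in at least one ingredient line.
matched = [e for e in record["ner"]
           if any(e in ing for ing in record["ingredients"])]
print(SOURCE_NAMES[record["source"]], len(matched))  # Gathered 6
```

When loading via the `datasets` library, the same mapping is available as `ds.features["source"].int2str(...)`.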
### Data Splits
The dataset contains a single `train` split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
I (the "Researcher") have requested permission to use the RecipeNLG dataset (the "Dataset") at Poznań University of Technology (PUT). In exchange for such permission, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Dataset only for non-commercial research and educational purposes.
2. PUT makes no representations or warranties regarding the Dataset, including but not limited to warranties of non-infringement or fitness for a particular purpose.
3. Researcher accepts full responsibility for his or her use of the Dataset and shall defend and indemnify PUT, including its employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Dataset including but not limited to Researcher's use of any copies of copyrighted images or text that he or she may create from the Dataset.
4. Researcher may provide research associates and colleagues with access to the Dataset provided that they first agree to be bound by these terms and conditions.
5. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
### Citation Information
```bibtex
@inproceedings{bien-etal-2020-recipenlg,
title = "{R}ecipe{NLG}: A Cooking Recipes Dataset for Semi-Structured Text Generation",
author = "Bie{\'n}, Micha{\l} and
Gilski, Micha{\l} and
Maciejewska, Martyna and
Taisner, Wojciech and
Wisniewski, Dawid and
Lawrynowicz, Agnieszka",
booktitle = "Proceedings of the 13th International Conference on Natural Language Generation",
month = dec,
year = "2020",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.inlg-1.4",
pages = "22--28",
}
```
### Contributions
Thanks to [@abhishekkrthakur](https://github.com/abhishekkrthakur) for adding this dataset. |
dzsiimon/Primeiro | ---
license: openrail
---
|
maywell/gpt4_evol_1.3k | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 3467056
num_examples: 1316
download_size: 1918245
dataset_size: 3467056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gpt4_evol_1.3k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/partitioned_v3_standardized_027 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 6027807.300735752
num_examples: 11210
download_size: 6004359
dataset_size: 6027807.300735752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_027"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rcds/swiss_judgment_prediction_xl | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
language:
- it
- de
- fr
pretty_name: Swiss Judgment Prediction XL
size_categories:
- 100K<n<1M
---
# Dataset Card for Swiss Judgment Prediction XL
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Swiss Judgment Prediction is a multilingual, diachronic dataset of 329K Swiss Federal Supreme Court (FSCS) cases. This dataset supports a challenging text classification task: predicting the judgment outcome of a case.
### Supported Tasks and Leaderboards
### Languages
Switzerland has four official languages; three of them (German, French, and Italian) are represented in this dataset. The decisions are written by the judges and clerks in the language of the proceedings.
| Language | Subset | Number of Documents Full |
|------------|------------|--------------------------|
| German | **de** | 160K |
| French | **fr** | 128K |
| Italian | **it** | 41K |
## Dataset Structure
### Data Fields
```
- decision_id: unique identifier for the decision
- facts: facts section of the decision
- considerations: considerations section of the decision
- label: label of the decision
- law_area: area of law of the decision
- language: language of the decision
- year: year of the decision
- court: court of the decision
- chamber: chamber of the decision
- canton: canton of the decision
- region: region of the decision
```
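The `language` field makes it straightforward to work with one language subset at a time, as in the table above. A minimal sketch on toy records shaped like the field list (the values are assumptions, not real decisions):

```python
from collections import Counter

# Toy records mimicking the field layout above.
records = [
    {"decision_id": "a1", "language": "de", "year": 2015, "facts": "..."},
    {"decision_id": "b2", "language": "fr", "year": 2018, "facts": "..."},
    {"decision_id": "c3", "language": "de", "year": 2020, "facts": "..."},
]

# Select the German subset and count documents per language.
german = [r for r in records if r["language"] == "de"]
counts = Counter(r["language"] for r in records)
print(len(german), dict(counts))  # 2 {'de': 2, 'fr': 1}
```

With the `datasets` library, the equivalent operation on the real data would be a `filter` on the `language` column.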
### Data Instances
[More Information Needed]
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
The original data are published from the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML.
#### Who are the source language producers?
The decisions are written by the judges and clerks in the language of the proceedings.
### Annotations
#### Annotation process
#### Who are the annotators?
Metadata is published by the Swiss Federal Supreme Court (https://www.bger.ch).
### Personal and Sensitive Information
The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
We release the data under CC-BY-4.0 which complies with the court licensing (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf)
© Swiss Federal Supreme Court, 2002-2022
The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf
### Citation Information
Please cite our [ArXiv-Preprint](https://arxiv.org/abs/2306.09237)
```
@misc{rasiah2023scale,
title={SCALE: Scaling up the Complexity for Advanced Language Model Evaluation},
author={Vishvaksenan Rasiah and Ronja Stern and Veton Matoshi and Matthias Stürmer and Ilias Chalkidis and Daniel E. Ho and Joel Niklaus},
year={2023},
eprint={2306.09237},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
|
jinlibao/en-forecasting-bigdata-query-parsed | ---
dataset_info:
features:
- name: symbol
dtype: string
- name: stock_movement
dtype: string
- name: tweet
dtype: string
- name: instruction
dtype: string
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 28343871
num_examples: 4897
- name: test
num_bytes: 3647674
num_examples: 1472
- name: valid
num_bytes: 1961125
num_examples: 798
download_size: 15858301
dataset_size: 33952670
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
iohadrubin/lm_task | ---
dataset_info:
- config_name: default
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 59331744
num_examples: 6324
- name: validation
num_bytes: 9637776
num_examples: 6324
- name: test
num_bytes: 13852388
num_examples: 8745
download_size: 46005998
dataset_size: 82821908
- config_name: v2
features:
- name: input_ids
sequence: int32
- name: ticker
dtype: string
splits:
- name: train
num_bytes: 59379024
num_examples: 6324
- name: validation
num_bytes: 9685056
num_examples: 6324
- name: test
num_bytes: 13393288
num_examples: 8745
- name: all_
num_bytes: 82837224
num_examples: 8745
download_size: 87909071
dataset_size: 165294592
- config_name: v3
features:
- name: input_ids
sequence: int32
- name: ticker
dtype: string
splits:
- name: train
num_bytes: 121806864
num_examples: 6324
- name: validation
num_bytes: 19803456
num_examples: 6324
- name: test
num_bytes: 27385288
num_examples: 8745
- name: all_
num_bytes: 169928104
num_examples: 8745
download_size: 159556827
dataset_size: 338923712
- config_name: v4
features:
- name: input_ids
sequence: int32
- name: ticker
dtype: string
splits:
- name: train
num_bytes: 121806864
num_examples: 6324
- name: validation
num_bytes: 19803456
num_examples: 6324
- name: test
num_bytes: 27385288
num_examples: 8745
- name: all_
num_bytes: 169928104
num_examples: 8745
download_size: 159488311
dataset_size: 338923712
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- config_name: v2
data_files:
- split: train
path: v2/train-*
- split: validation
path: v2/validation-*
- split: test
path: v2/test-*
- split: all_
path: v2/all_-*
- config_name: v3
data_files:
- split: train
path: v3/train-*
- split: validation
path: v3/validation-*
- split: test
path: v3/test-*
- split: all_
path: v3/all_-*
- config_name: v4
data_files:
- split: train
path: v4/train-*
- split: validation
path: v4/validation-*
- split: test
path: v4/test-*
- split: all_
path: v4/all_-*
---
|
WenyangHui/craigslist-bargain | ---
license: mit
---
|
tyzhu/find_last_sent_train_30_eval_10_hint5 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 90407
num_examples: 70
- name: validation
num_bytes: 11176
num_examples: 10
download_size: 65754
dataset_size: 101583
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_30_eval_10_hint5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_law-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 2047921
num_examples: 1534
download_size: 1128003
dataset_size: 2047921
---
# Dataset Card for "mmlu-professional_law-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ghomasHudson/longdoc_paired_hotpotqa | ---
dataset_info:
features:
- name: input
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 1349024656
num_examples: 671376
- name: validation
num_bytes: 114260998
num_examples: 57844
download_size: 800718173
dataset_size: 1463285654
---
# Dataset Card for "longdoc_paired_hotpotqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pnadel/open-homer | ---
dataset_info:
features:
- name: sentid
dtype: string
- name: cit
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5289378
num_examples: 8000
- name: test
num_bytes: 1336564
num_examples: 2000
download_size: 2858781
dataset_size: 6625942
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
bharat-raghunathan/indian-foods-dataset | ---
license: cc0-1.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': biryani
'1': cholebhature
'2': dabeli
'3': dal
'4': dhokla
'5': dosa
'6': jalebi
'7': kathiroll
'8': kofta
'9': naan
'10': pakora
'11': paneer
'12': panipuri
'13': pavbhaji
'14': vadapav
splits:
- name: train
num_bytes: 611741947.222
num_examples: 3809
- name: test
num_bytes: 153961285
num_examples: 961
download_size: 688922167
dataset_size: 765703232.222
task_categories:
- image-classification
- text-to-image
language:
- en
pretty_name: indian-foods
size_categories:
- 1K<n<10K
---
# Dataset Card for Indian Foods Dataset
## Dataset Description
- **Homepage:** https://www.kaggle.com/datasets/anshulmehtakaggl/themassiveindianfooddataset
- **Repository:** https://www.kaggle.com/datasets/anshulmehtakaggl/themassiveindianfooddataset
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** https://www.kaggle.com/anshulmehtakaggl
### Dataset Summary
This is a multi-class image-classification dataset of Indian foods, showcasing [The-massive-Indian-Food-Dataset](https://www.kaggle.com/datasets/anshulmehtakaggl/themassiveindianfooddataset).
This card has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
English
## Dataset Structure
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['biryani', 'cholebhature', 'dabeli', 'dal', 'dhokla', 'dosa', 'jalebi', 'kathiroll', 'kofta', 'naan', 'pakora', 'paneer', 'panipuri', 'pavbhaji', 'vadapav'], id=None)"
}
```
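The integer labels above map to class names by list position. A minimal plain-Python sketch of the equivalent `str2int`/`int2str` lookups (using only the class names declared in this card):

```python
# Class names in the order declared by the dataset's ClassLabel feature.
names = ['biryani', 'cholebhature', 'dabeli', 'dal', 'dhokla', 'dosa',
         'jalebi', 'kathiroll', 'kofta', 'naan', 'pakora', 'paneer',
         'panipuri', 'pavbhaji', 'vadapav']

# str2int / int2str equivalents as plain dict lookups.
label2id = {name: i for i, name in enumerate(names)}
id2label = {i: name for i, name in enumerate(names)}

print(label2id['dosa'])  # 5
print(id2label[14])      # vadapav
```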
### Dataset Splits
This dataset has train and test splits with the following sizes:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 3809 |
| test | 961 |
### Data Instances
Each instance is a picture of an Indian food item, along with the category it belongs to.
#### Initial Data Collection and Normalization
Collected by scraping data from Google Images and leveraging some JavaScript functions.
All images are resized to (300, 300) to maintain size uniformity.
### Dataset Curators
[Anshul Mehta](https://www.kaggle.com/anshulmehtakaggl)
### Licensing Information
[CC0: Public Domain](https://creativecommons.org/publicdomain/zero/1.0/)
### Citation Information
[The Massive Indian Foods Dataset](https://www.kaggle.com/datasets/anshulmehtakaggl/themassiveindianfooddataset) |
justram/sections | ---
dataset_info:
features:
- name: text_id
dtype: string
- name: page_url
dtype: string
- name: page_title
dtype: string
- name: section_title
dtype: string
- name: context_page_description
dtype: string
- name: context_section_description
dtype: string
- name: media
sequence: string
- name: hierachy
sequence: string
- name: category
sequence: string
splits:
- name: train
num_bytes: 41979319581
num_examples: 28473864
download_size: 15032145200
dataset_size: 41979319581
---
# Dataset Card for "sections"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
masonwill/buddha_asisa | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 561248216.278
num_examples: 1619
- name: test
num_bytes: 129240718.0
num_examples: 384
download_size: 679838425
dataset_size: 690488934.278
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-e1907042-7494828 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- clinc_oos
eval_info:
task: multi_class_classification
model: lewtun/roberta-large-finetuned-clinc
metrics: []
dataset_name: clinc_oos
dataset_config: small
dataset_split: test
col_mapping:
text: text
target: intent
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: lewtun/roberta-large-finetuned-clinc
* Dataset: clinc_oos
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
Kasper7953/github-issues_big | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 80208096.0
num_examples: 20012
- name: val
num_bytes: 14156256.0
num_examples: 3532
download_size: 26942780
dataset_size: 94364352.0
---
# Dataset Card for "github-issues_big"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Stopwolf/EQ-Bench-Serbian | ---
license: apache-2.0
language:
- sr
- bs
- hr
---
# EQ-Bench-Serbian 🇷🇸
EQ-Bench is a benchmark for language models designed to assess emotional intelligence. You can read more about it in the [paper](https://arxiv.org/abs/2312.06281).
This benchmark was picked because the English EQ-Bench correlates very strongly with other evaluations
(a 0.97 correlation with MMLU and a 0.94 correlation with LMSYS Arena Elo).
Since it wouldn't be feasible to create an arena for the handful of models available in Serbian, we went in this direction.
This dataset has been translated with the help of OpenAI's GPT-3.5-turbo model. Afterwards, it was manually cleaned and corrected. It is primarily for the Serbian language, but can be used for Bosnian and Croatian.
# Results 📊
<!---Instead of taking the better result between first pass and revised scores, we take revised scores exclusively since they are influenced by the models critique.
If the model "knows" a language, in this case Serbian, usually the revised scores end up being better. If the model just understands the language,
but doesn't know how to command it, the first pass scores will tend to be better (which is the case for some of the models below).--->
Instead of using the better result between the first-pass and revised scores, we first scale each score by the proportion of parseable answers.
This way, we penalize models which seem to be functioning well but don't actually know Serbian (i.e. have high scores but fewer parseable answers).
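The penalty described above can be sketched as a simple multiplication; this is a hypothetical helper illustrating the idea (the function name, arguments, and example numbers are assumptions, not the benchmark's actual implementation):

```python
def scaled_eq_score(raw_score: float, n_parseable: int, n_total: int) -> float:
    """Scale a raw EQ-Bench score by the fraction of parseable answers."""
    return raw_score * (n_parseable / n_total)

# A model scoring 70 raw but parsing only 120 of 171 answers is penalized:
print(round(scaled_eq_score(70.0, 120, 171), 2))  # 49.12
```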
| Model | EQ Bench |
|-------------------------|------------|
| GPT4-0125-preview | 75.82 |
| [Tito](https://huggingface.co/Stopwolf/Tito-7B-slerp) | 58.06 |
| [Tito](https://huggingface.co/Stopwolf/Tito-7B-slerp) + system prompt | 57.64 |
| [Perućac](https://huggingface.co/Stopwolf/Perucac-7B-slerp) (ChatML) | 57.21 |
| GPT3.5-turbo-0125 | 53.68 |
| [Yugo55A-GPT](https://huggingface.co/datatab/Yugo55A-GPT) | 53.55 |
| [Mustra](https://huggingface.co/Stopwolf/Mustra-7B-Instruct-v0.1) | 48.93 |
| [Zamfir](https://huggingface.co/Stopwolf/Zamfir-7B-slerp) | 42.38 |
| [AlphaMonarch](https://huggingface.co/mlabonne/AlphaMonarch-7B) + system prompt| 41.64 |
| [Nous-Hermes-Mistral-DPO](https://huggingface.co/NousResearch/Nous-Hermes-2-Mistral-7B-DPO)*| 41.64 |
| [Yugo60-GPT](https://huggingface.co/datatab/Yugo60-GPT) | 39.36 |
| [Zamfir](https://huggingface.co/Stopwolf/Zamfir-7B-slerp) + system prompt | 37.18 |
| [YugoGPT-Chat-Align](yugochat.com)** | 36.22 |
\* [Nous-Hermes-Mistral-DPO](https://huggingface.co/NousResearch/Nous-Hermes-2-Mistral-7B-DPO) and [AlphaMonarch](https://huggingface.co/mlabonne/AlphaMonarch-7B)
are primarily English models. We used them just as reference points, since they are among the stronger English 7B models and because AlphaMonarch is
used in some of the merges above.
** YugoGPT was used via [yugochat.com](yugochat.com/en), so we presume it is the best available chat variant, also aligned with DPO (or some similar method).
## Findings 🔍
Couple of expected and unexpected findings:
1. GPT4-turbo (0125-preview) is the best currently available model for Serbian among the evaluated models,
2. There are already some models that are better than GPT3.5-turbo (0125 version),
3. YugoGPT-Chat-Align unexpectedly scores very low,
4. Perućac-7B-slerp (a merge of WestLake-7B-v2 & YugoGPT targeted to score highly on this benchmark) indeed had high scores, although I'm not sure it possesses
good command of the Serbian language.
5. We expected the models to perform better, not worse, when adding the system prompt*. The idea behind doing so was to center the model on the Serbian language from the start.
\* The system prompt mentioned and used here is a direct translation of Mistral's system prompt:
`Ti si pošten i iskren asistent pomoćnik. Uvek odgovaraj što korisnije možeš. Ako pitanje nema smisla, ili nije koherentno,
objasni zašto je tako umesto da odgovaraš netačno. Ako ne znaš odgovor na pitanje, molim te da ne odgovaraš sa netačnim informacijama.`
# To-do 📋
* add scores for the remaining GPT models in order to see how other models compare
* add scores for other closed models such as Gemini, Mistral-Large, Claude etc. |
shujatoor/test_dataset-1 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2953
num_examples: 22
download_size: 3408
dataset_size: 2953
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mlqa | ---
pretty_name: MLQA (MultiLingual Question Answering)
language:
- en
- de
- es
- ar
- zh
- vi
- hi
license:
- cc-by-sa-3.0
source_datasets:
- original
size_categories:
- 10K<n<100K
language_creators:
- crowdsourced
annotations_creators:
- crowdsourced
multilinguality:
- multilingual
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: mlqa
dataset_info:
- config_name: mlqa-translate-train.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 101227245
num_examples: 78058
- name: validation
num_bytes: 13144332
num_examples: 9512
download_size: 63364123
dataset_size: 114371577
- config_name: mlqa-translate-train.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 77996825
num_examples: 80069
- name: validation
num_bytes: 10322113
num_examples: 9927
download_size: 63364123
dataset_size: 88318938
- config_name: mlqa-translate-train.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 97387431
num_examples: 84816
- name: validation
num_bytes: 12731112
num_examples: 10356
download_size: 63364123
dataset_size: 110118543
- config_name: mlqa-translate-train.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 55143547
num_examples: 76285
- name: validation
num_bytes: 7418070
num_examples: 9568
download_size: 63364123
dataset_size: 62561617
- config_name: mlqa-translate-train.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 80789653
num_examples: 81810
- name: validation
num_bytes: 10718376
num_examples: 10123
download_size: 63364123
dataset_size: 91508029
- config_name: mlqa-translate-train.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 168117671
num_examples: 82451
- name: validation
num_bytes: 22422152
num_examples: 10253
download_size: 63364123
dataset_size: 190539823
- config_name: mlqa-translate-test.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 5484467
num_examples: 5335
download_size: 10075488
dataset_size: 5484467
- config_name: mlqa-translate-test.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 3884332
num_examples: 4517
download_size: 10075488
dataset_size: 3884332
- config_name: mlqa-translate-test.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 5998327
num_examples: 5495
download_size: 10075488
dataset_size: 5998327
- config_name: mlqa-translate-test.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4831704
num_examples: 5137
download_size: 10075488
dataset_size: 4831704
- config_name: mlqa-translate-test.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 3916758
num_examples: 5253
download_size: 10075488
dataset_size: 3916758
- config_name: mlqa-translate-test.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4608811
num_examples: 4918
download_size: 10075488
dataset_size: 4608811
- config_name: mlqa.ar.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 8216837
num_examples: 5335
- name: validation
num_bytes: 808830
num_examples: 517
download_size: 75719050
dataset_size: 9025667
- config_name: mlqa.ar.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2132247
num_examples: 1649
- name: validation
num_bytes: 358554
num_examples: 207
download_size: 75719050
dataset_size: 2490801
- config_name: mlqa.ar.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 3235363
num_examples: 2047
- name: validation
num_bytes: 283834
num_examples: 163
download_size: 75719050
dataset_size: 3519197
- config_name: mlqa.ar.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 3175660
num_examples: 1912
- name: validation
num_bytes: 334016
num_examples: 188
download_size: 75719050
dataset_size: 3509676
- config_name: mlqa.ar.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 8074057
num_examples: 5335
- name: validation
num_bytes: 794775
num_examples: 517
download_size: 75719050
dataset_size: 8868832
- config_name: mlqa.ar.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2981237
num_examples: 1978
- name: validation
num_bytes: 223188
num_examples: 161
download_size: 75719050
dataset_size: 3204425
- config_name: mlqa.ar.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2993225
num_examples: 1831
- name: validation
num_bytes: 276727
num_examples: 186
download_size: 75719050
dataset_size: 3269952
- config_name: mlqa.de.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1587005
num_examples: 1649
- name: validation
num_bytes: 195822
num_examples: 207
download_size: 75719050
dataset_size: 1782827
- config_name: mlqa.de.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4274496
num_examples: 4517
- name: validation
num_bytes: 477366
num_examples: 512
download_size: 75719050
dataset_size: 4751862
- config_name: mlqa.de.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1654540
num_examples: 1675
- name: validation
num_bytes: 211985
num_examples: 182
download_size: 75719050
dataset_size: 1866525
- config_name: mlqa.de.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1645937
num_examples: 1621
- name: validation
num_bytes: 180114
num_examples: 190
download_size: 75719050
dataset_size: 1826051
- config_name: mlqa.de.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4251153
num_examples: 4517
- name: validation
num_bytes: 474863
num_examples: 512
download_size: 75719050
dataset_size: 4726016
- config_name: mlqa.de.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1678176
num_examples: 1776
- name: validation
num_bytes: 166193
num_examples: 196
download_size: 75719050
dataset_size: 1844369
- config_name: mlqa.de.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1343983
num_examples: 1430
- name: validation
num_bytes: 150679
num_examples: 163
download_size: 75719050
dataset_size: 1494662
- config_name: mlqa.vi.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 3164094
num_examples: 2047
- name: validation
num_bytes: 226724
num_examples: 163
download_size: 75719050
dataset_size: 3390818
- config_name: mlqa.vi.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2189315
num_examples: 1675
- name: validation
num_bytes: 272794
num_examples: 182
download_size: 75719050
dataset_size: 2462109
- config_name: mlqa.vi.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 7807045
num_examples: 5495
- name: validation
num_bytes: 715291
num_examples: 511
download_size: 75719050
dataset_size: 8522336
- config_name: mlqa.vi.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2947458
num_examples: 1943
- name: validation
num_bytes: 265154
num_examples: 184
download_size: 75719050
dataset_size: 3212612
- config_name: mlqa.vi.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 7727204
num_examples: 5495
- name: validation
num_bytes: 707925
num_examples: 511
download_size: 75719050
dataset_size: 8435129
- config_name: mlqa.vi.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2822481
num_examples: 2018
- name: validation
num_bytes: 279235
num_examples: 189
download_size: 75719050
dataset_size: 3101716
- config_name: mlqa.vi.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2738045
num_examples: 1947
- name: validation
num_bytes: 251470
num_examples: 177
download_size: 75719050
dataset_size: 2989515
- config_name: mlqa.zh.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1697005
num_examples: 1912
- name: validation
num_bytes: 171743
num_examples: 188
download_size: 75719050
dataset_size: 1868748
- config_name: mlqa.zh.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1356268
num_examples: 1621
- name: validation
num_bytes: 170686
num_examples: 190
download_size: 75719050
dataset_size: 1526954
- config_name: mlqa.zh.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1770535
num_examples: 1943
- name: validation
num_bytes: 169651
num_examples: 184
download_size: 75719050
dataset_size: 1940186
- config_name: mlqa.zh.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4324740
num_examples: 5137
- name: validation
num_bytes: 433960
num_examples: 504
download_size: 75719050
dataset_size: 4758700
- config_name: mlqa.zh.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4353361
num_examples: 5137
- name: validation
num_bytes: 437016
num_examples: 504
download_size: 75719050
dataset_size: 4790377
- config_name: mlqa.zh.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1697983
num_examples: 1947
- name: validation
num_bytes: 134693
num_examples: 161
download_size: 75719050
dataset_size: 1832676
- config_name: mlqa.zh.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1547159
num_examples: 1767
- name: validation
num_bytes: 180928
num_examples: 189
download_size: 75719050
dataset_size: 1728087
- config_name: mlqa.en.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 6641971
num_examples: 5335
- name: validation
num_bytes: 621075
num_examples: 517
download_size: 75719050
dataset_size: 7263046
- config_name: mlqa.en.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4966262
num_examples: 4517
- name: validation
num_bytes: 584725
num_examples: 512
download_size: 75719050
dataset_size: 5550987
- config_name: mlqa.en.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 6958087
num_examples: 5495
- name: validation
num_bytes: 631268
num_examples: 511
download_size: 75719050
dataset_size: 7589355
- config_name: mlqa.en.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 6441614
num_examples: 5137
- name: validation
num_bytes: 598772
num_examples: 504
download_size: 75719050
dataset_size: 7040386
- config_name: mlqa.en.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 13787522
num_examples: 11590
- name: validation
num_bytes: 1307399
num_examples: 1148
download_size: 75719050
dataset_size: 15094921
- config_name: mlqa.en.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 6074990
num_examples: 5253
- name: validation
num_bytes: 545657
num_examples: 500
download_size: 75719050
dataset_size: 6620647
- config_name: mlqa.en.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 6293785
num_examples: 4918
- name: validation
num_bytes: 614223
num_examples: 507
download_size: 75719050
dataset_size: 6908008
- config_name: mlqa.es.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1696778
num_examples: 1978
- name: validation
num_bytes: 145105
num_examples: 161
download_size: 75719050
dataset_size: 1841883
- config_name: mlqa.es.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1361983
num_examples: 1776
- name: validation
num_bytes: 139968
num_examples: 196
download_size: 75719050
dataset_size: 1501951
- config_name: mlqa.es.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1707141
num_examples: 2018
- name: validation
num_bytes: 172801
num_examples: 189
download_size: 75719050
dataset_size: 1879942
- config_name: mlqa.es.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1635294
num_examples: 1947
- name: validation
num_bytes: 122829
num_examples: 161
download_size: 75719050
dataset_size: 1758123
- config_name: mlqa.es.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4249431
num_examples: 5253
- name: validation
num_bytes: 408169
num_examples: 500
download_size: 75719050
dataset_size: 4657600
- config_name: mlqa.es.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4281273
num_examples: 5253
- name: validation
num_bytes: 411196
num_examples: 500
download_size: 75719050
dataset_size: 4692469
- config_name: mlqa.es.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 1489611
num_examples: 1723
- name: validation
num_bytes: 178003
num_examples: 187
download_size: 75719050
dataset_size: 1667614
- config_name: mlqa.hi.ar
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4374373
num_examples: 1831
- name: validation
num_bytes: 402817
num_examples: 186
download_size: 75719050
dataset_size: 4777190
- config_name: mlqa.hi.de
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 2961556
num_examples: 1430
- name: validation
num_bytes: 294325
num_examples: 163
download_size: 75719050
dataset_size: 3255881
- config_name: mlqa.hi.vi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4664436
num_examples: 1947
- name: validation
num_bytes: 411654
num_examples: 177
download_size: 75719050
dataset_size: 5076090
- config_name: mlqa.hi.zh
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 4281309
num_examples: 1767
- name: validation
num_bytes: 416192
num_examples: 189
download_size: 75719050
dataset_size: 4697501
- config_name: mlqa.hi.en
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 11245629
num_examples: 4918
- name: validation
num_bytes: 1076115
num_examples: 507
download_size: 75719050
dataset_size: 12321744
- config_name: mlqa.hi.es
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 3789337
num_examples: 1723
- name: validation
num_bytes: 412469
num_examples: 187
download_size: 75719050
dataset_size: 4201806
- config_name: mlqa.hi.hi
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: answer_start
dtype: int32
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: test
num_bytes: 11606982
num_examples: 4918
- name: validation
num_bytes: 1115055
num_examples: 507
download_size: 75719050
dataset_size: 12722037
---
# Dataset Card for "mlqa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/facebookresearch/MLQA](https://github.com/facebookresearch/MLQA)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 4.15 GB
- **Size of the generated dataset:** 910.01 MB
- **Total amount of disk used:** 5.06 GB
### Dataset Summary
MLQA (MultiLingual Question Answering) is a benchmark dataset for evaluating cross-lingual question answering performance.
MLQA consists of over 5K extractive QA instances (12K in English) in SQuAD format in seven languages: English, Arabic,
German, Spanish, Hindi, Vietnamese, and Simplified Chinese. MLQA is highly parallel, with QA instances parallel between
four different languages on average.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
MLQA contains QA instances in seven languages: English, Arabic, German, Spanish, Hindi, Vietnamese, and Simplified Chinese.
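As the YAML metadata above suggests, the cross-lingual configs follow a `mlqa.<context language>.<question language>` naming pattern over these seven languages. A minimal sketch generating the full set of pair config names (assuming, per the MLQA paper, that all 49 pairs exist):

```python
from itertools import product

# ISO 639-1 codes for the seven MLQA languages
langs = ["en", "ar", "de", "es", "hi", "vi", "zh"]

# Each config pairs a context language with a question language
configs = [f"mlqa.{ctx}.{q}" for ctx, q in product(langs, langs)]

print(len(configs))  # 49 context/question pairs
print(configs[:3])   # ['mlqa.en.en', 'mlqa.en.ar', 'mlqa.en.de']
```

Any one of these names can then be passed as the second argument to `datasets.load_dataset("mlqa", ...)` to select that pair.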
## Dataset Structure
### Data Instances
#### mlqa-translate-test.ar
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 5.48 MB
- **Total amount of disk used:** 15.56 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.de
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 3.88 MB
- **Total amount of disk used:** 13.96 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.es
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 3.92 MB
- **Total amount of disk used:** 13.99 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.hi
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 4.61 MB
- **Total amount of disk used:** 14.68 MB
An example of 'test' looks as follows.
```
```
#### mlqa-translate-test.vi
- **Size of downloaded dataset files:** 10.08 MB
- **Size of the generated dataset:** 6.00 MB
- **Total amount of disk used:** 16.07 MB
An example of 'test' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### mlqa-translate-test.ar
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
  - `answer_start`: an `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.de
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
  - `answer_start`: an `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.es
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
  - `answer_start`: an `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.hi
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
  - `answer_start`: an `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
#### mlqa-translate-test.vi
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
  - `answer_start`: an `int32` feature.
- `text`: a `string` feature.
- `id`: a `string` feature.
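In this SQuAD-style schema, each `answer_start` is a character offset into `context` at which the corresponding `text` span begins. A small sketch with a made-up instance (not taken from the dataset) showing how to recover the answer span from the offset:

```python
# Hypothetical instance in the SQuAD-style schema described above
example = {
    "context": "MLQA was released by Facebook Research in 2019.",
    "question": "When was MLQA released?",
    "answers": {"answer_start": [42], "text": ["2019"]},
    "id": "example-0",
}

start = example["answers"]["answer_start"][0]
answer = example["answers"]["text"][0]

# The offset indexes directly into the context string
span = example["context"][start : start + len(answer)]
assert span == answer
print(span)  # 2019
```

The same slice check is a quick way to sanity-check answer alignment when preprocessing extractive QA data.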
### Data Splits
| name |test|
|----------------------|---:|
|mlqa-translate-test.ar|5335|
|mlqa-translate-test.de|4517|
|mlqa-translate-test.es|5253|
|mlqa-translate-test.hi|4918|
|mlqa-translate-test.vi|5495|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{lewis2019mlqa,
title = {MLQA: Evaluating Cross-lingual Extractive Question Answering},
author = {Lewis, Patrick and Oguz, Barlas and Rinott, Ruty and Riedel, Sebastian and Schwenk, Holger},
journal = {arXiv preprint arXiv:1910.07475},
year = 2019,
eid = {arXiv: 1910.07475}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@M-Salti](https://github.com/M-Salti), [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf), [@mariamabarham](https://github.com/mariamabarham), [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
GEM-submissions/lewtun__this-is-another-test-name__1655983106 | ---
benchmark: gem
type: prediction
submission_name: This is another test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is another test name
|
bigscience-data/roots_indic-gu_wikiquote | ---
language: gu
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-gu_wikiquote
# wikiquote_filtered
- Dataset uid: `wikiquote_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0462 % of total
- 0.1697 % of en
- 0.0326 % of fr
- 0.0216 % of ar
- 0.0066 % of zh
- 0.0833 % of pt
- 0.0357 % of es
- 0.0783 % of indic-ta
- 0.0361 % of indic-hi
- 0.0518 % of ca
- 0.0405 % of vi
- 0.0834 % of indic-ml
- 0.0542 % of indic-te
- 0.1172 % of indic-gu
- 0.0634 % of indic-kn
- 0.0539 % of id
- 0.0454 % of indic-ur
- 0.0337 % of indic-mr
- 0.0347 % of eu
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-gu
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-kn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
|
fathyshalab/massive_recommendation-de-DE | ---
dataset_info:
features:
- name: id
dtype: string
- name: locale
dtype: string
- name: partition
dtype: string
- name: scenario
dtype:
class_label:
names:
'0': social
'1': transport
'2': calendar
'3': play
'4': news
'5': datetime
'6': recommendation
'7': email
'8': iot
'9': general
'10': audio
'11': lists
'12': qa
'13': cooking
'14': takeaway
'15': music
'16': alarm
'17': weather
- name: intent
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: text
dtype: string
- name: annot_utt
dtype: string
- name: worker_id
dtype: string
- name: slot_method
sequence:
- name: slot
dtype: string
- name: method
dtype: string
- name: judgments
sequence:
- name: worker_id
dtype: string
- name: intent_score
dtype: int8
- name: slots_score
dtype: int8
- name: grammar_score
dtype: int8
- name: spelling_score
dtype: int8
- name: language_identification
dtype: string
- name: label_name
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 137660
num_examples: 433
- name: validation
num_bytes: 22189
num_examples: 69
- name: test
num_bytes: 31179
num_examples: 94
download_size: 67251
dataset_size: 191028
---
# Dataset Card for "massive_recommendation-de-DE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kannada-LLM-Labs/Wikipedia-Kn | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 402848197
num_examples: 31437
download_size: 147074910
dataset_size: 402848197
license: mit
task_categories:
- text-generation
language:
- kn
size_categories:
- 10K<n<100K
---
# Dataset Card for "Wikipedia-Kn"
This is a filtered version of the [Wikipedia](https://huggingface.co/datasets/wikimedia/wikipedia) dataset containing only the Kannada-language samples.
The dataset contains a total of 31,437 samples.
### Data Sample:
```python
{'id': '832',
'url': 'https://kn.wikipedia.org/wiki/%E0%B2%A1%E0%B2%BF.%E0%B2%B5%E0%B2%BF.%E0%B2%97%E0%B3%81%E0%B2%82%E0%B2%A1%E0%B2%AA%E0%B3%8D%E0%B2%AA',
'title': 'ಡಿ.ವಿ.ಗುಂಡಪ್ಪ',
'text': 'ಡಿ ವಿ ಜಿ(ಮಾರ್ಚ್ ೧೭, ೧೮೮೭ - ಅಕ್ಟೋಬರ್ ೭, ೧೯೭೫) ಎಂಬ ಹೆಸರಿನಿಂದ ಪ್ರಸಿದ್ಧರಾದ ಡಾ. ದೇವನಹಳ್ಳಿ ವೆಂಕಟರಮಣಯ್ಯ ಗುಂಡಪ್ಪನವರು ಕರ್ನಾಟಕದ ಪ್ರಸಿದ್ಧ ಸಾಹಿತಿ, ಪತ್ರಕರ್ತರು. ಹಲವು ಕ್ಷೇತ್ರಗಳಲ್ಲಿ ಸೇವೆ ಸಲ್ಲಿಸಿದ ಇವರು ಕನ್ನಡದ ಆಧುನಿಕ ಸರ್ವಜ್ಞ ಎಂದೇ ಪ್ರಸಿದ್ಧರಾದವರು.\n\nಬಾಲ್ಯ ಜೀವನ\nಡಿ.ವಿ.ಜಿ ಅವರು ೧೮೮೭, ಮಾರ್ಚ್ ೧೭ರಂದು ಕೋಲಾರ ಜಿಲ್ಲೆಯ ಮುಳಬಾಗಿಲು ತಾಲೂಕಿನ ದೇವನಹಳ್ಳಿಯಲ್ಲಿ ಜನಿಸಿದರು.\n\nವೃತ್ತಿ ಜೀವನ\nಪ್ರೌಢಶಾಲೆಯಲ್ಲಿ\n\nಸಾಹಿತ್ಯ ಕೃಷಿ\nದಿವಾನ್ ರಂಗಾಚಾರ್ಯ ಅವರ ಬಗ್ಗೆ ಇಂಗ್ಲಿಷಿನಲ್ಲಿ ಬರೆದ ಲೇಖನ ಡಿ.ವಿ.ಜಿ ಅವರ ಬದುಕಲ್ಲಿ ಹೊಸ ತಿರುವು ಪಡೆಯಿತು. ಮುಂದೆ ಪುಸ್ತಕ ರೂಪಕ್ಕೆ ತರಲು ಹಲವು ಮಾರ್ಪಾಡು ಮಾಡಿದರು. ಇದು ಪ್ರಕಟವಾಗುತ್ತಿದ್ದಂ....."
}
```
### Use with Datasets
```python
from datasets import load_dataset
ds = load_dataset("Kannada-LLM-Labs/Wikipedia-Kn")
``` |
breno30/KarlosWonney | ---
license: openrail
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_cot_v3-math-237e7b-2016766699 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_cot_v3
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-66b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_cot_v3
dataset_config: mathemakitten--winobias_antistereotype_test_cot_v3
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-66b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_cot_v3
* Config: mathemakitten--winobias_antistereotype_test_cot_v3
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
mikhail-panzo/processed_malay_dataset_normalized | ---
dataset_info:
features:
- name: speaker_embeddings
sequence: float32
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
splits:
- name: train
num_bytes: 627391612
num_examples: 4944
download_size: 625741867
dataset_size: 627391612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|