| datasetId | card |
|---|---|
inditechie/cavalier | ---
license: unknown
---
|
Helsinki-NLP/opus_ubuntu | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- found
language:
- ace
- af
- ak
- am
- an
- ang
- ar
- ary
- as
- ast
- az
- ba
- bal
- be
- bem
- ber
- bg
- bho
- bn
- bo
- br
- brx
- bs
- bua
- byn
- ca
- ce
- ceb
- chr
- ckb
- co
- crh
- cs
- csb
- cv
- cy
- da
- de
- dsb
- dv
- dz
- el
- en
- eo
- es
- et
- eu
- fa
- ff
- fi
- fil
- fo
- fr
- frm
- frp
- fur
- fy
- ga
- gd
- gl
- gn
- grc
- gu
- guc
- gv
- ha
- haw
- he
- hi
- hil
- hne
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ig
- io
- is
- it
- iu
- ja
- jbo
- jv
- ka
- kab
- kg
- kk
- kl
- km
- kn
- ko
- kok
- ks
- ksh
- ku
- kw
- ky
- la
- lb
- lg
- li
- lij
- lld
- ln
- lo
- lt
- ltg
- lv
- mai
- mg
- mh
- mhr
- mi
- miq
- mk
- ml
- mn
- mr
- ms
- mt
- mus
- my
- nan
- nap
- nb
- nds
- ne
- nhn
- nl
- nn
- 'no'
- nso
- ny
- oc
- om
- or
- os
- pa
- pam
- pap
- pl
- pms
- pmy
- ps
- pt
- qu
- rm
- ro
- rom
- ru
- rw
- sa
- sc
- sco
- sd
- se
- shn
- shs
- si
- sk
- sl
- sm
- sml
- sn
- so
- son
- sq
- sr
- st
- sv
- sw
- syr
- szl
- ta
- te
- tet
- tg
- th
- ti
- tk
- tl
- tlh
- tr
- trv
- ts
- tt
- ug
- uk
- ur
- uz
- ve
- vec
- vi
- wa
- wae
- wo
- xal
- xh
- yi
- yo
- zh
- zu
- zza
license:
- bsd-3-clause
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
- 1K<n<10K
- n<1K
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: Opus Ubuntu
config_names:
- as-bs
- az-cs
- bg-de
- bn-ga
- br-es_PR
- br-hi
- br-la
- br-uz
- br-yi
- bs-szl
language_bcp47:
- ar-SY
- bn-IN
- de-AT
- de-DE
- en-AU
- en-CA
- en-GB
- en-NZ
- en-US
- es-AR
- es-CL
- es-CO
- es-CR
- es-DO
- es-EC
- es-ES
- es-GT
- es-HN
- es-MX
- es-NI
- es-PA
- es-PE
- es-PR
- es-SV
- es-UY
- es-VE
- fa-AF
- fr-CA
- fr-FR
- nl-NL
- pt-BR
- pt-PT
- ta-LK
- zh-CN
- zh-HK
- zh-TW
dataset_info:
- config_name: as-bs
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- as
- bs
splits:
- name: train
num_bytes: 1037799
num_examples: 8583
download_size: 470874
dataset_size: 1037799
- config_name: az-cs
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- az
- cs
splits:
- name: train
num_bytes: 17809
num_examples: 293
download_size: 14637
dataset_size: 17809
- config_name: bg-de
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- bg
- de
splits:
- name: train
num_bytes: 27615
num_examples: 184
download_size: 16278
dataset_size: 27615
- config_name: bn-ga
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- bn
- ga
splits:
- name: train
num_bytes: 584617
num_examples: 7324
download_size: 272247
dataset_size: 584617
- config_name: br-es_PR
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- br
- es_PR
splits:
- name: train
num_bytes: 8863
num_examples: 125
download_size: 8194
dataset_size: 8863
- config_name: br-hi
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- br
- hi
splits:
- name: train
num_bytes: 1300057
num_examples: 15551
download_size: 641803
dataset_size: 1300057
- config_name: br-la
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- br
- la
splits:
- name: train
num_bytes: 29329
num_examples: 527
download_size: 17723
dataset_size: 29329
- config_name: br-uz
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- br
- uz
splits:
- name: train
num_bytes: 110266
num_examples: 1416
download_size: 62660
dataset_size: 110266
- config_name: br-yi
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- br
- yi
splits:
- name: train
num_bytes: 172834
num_examples: 2799
download_size: 77870
dataset_size: 172834
- config_name: bs-szl
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- bs
- szl
splits:
- name: train
num_bytes: 41104
num_examples: 646
download_size: 30035
dataset_size: 41104
configs:
- config_name: as-bs
data_files:
- split: train
path: as-bs/train-*
- config_name: az-cs
data_files:
- split: train
path: az-cs/train-*
- config_name: bg-de
data_files:
- split: train
path: bg-de/train-*
- config_name: bn-ga
data_files:
- split: train
path: bn-ga/train-*
- config_name: br-es_PR
data_files:
- split: train
path: br-es_PR/train-*
- config_name: br-hi
data_files:
- split: train
path: br-hi/train-*
- config_name: br-la
data_files:
- split: train
path: br-la/train-*
- config_name: br-uz
data_files:
- split: train
path: br-uz/train-*
- config_name: br-yi
data_files:
- split: train
path: br-yi/train-*
- config_name: bs-szl
data_files:
- split: train
path: bs-szl/train-*
---
# Dataset Card for Opus Ubuntu
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://opus.nlpl.eu/Ubuntu.php
- **Repository:** None
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2012/pdf/463_Paper.pdf
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
These are translations of the Ubuntu software package messages, donated by the Ubuntu community.
To load a language pair that isn't one of the predefined configs, simply pass the two language codes as parameters.
You can find the valid pairs in the Homepage section of the Dataset Description: http://opus.nlpl.eu/Ubuntu.php
E.g.
`dataset = load_dataset("opus_ubuntu", lang1="it", lang2="pl")`
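Expanded into a runnable snippet (a minimal sketch based on the example above; `it`-`pl` is assumed to be one of the valid pairs listed on the OPUS page):
```python
from datasets import load_dataset

# Load a pair that is not among the predefined configs by passing the two language codes.
dataset = load_dataset("opus_ubuntu", lang1="it", lang2="pl", split="train")

# Each example carries an "id" and a "translation" dict keyed by language code.
print(dataset[0])
```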
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
Example instance:
```
{
'id': '0',
'translation': {
'it': 'Comprende Gmail, Google Docs, Google+, YouTube e Picasa',
'pl': 'Zawiera Gmail, Google Docs, Google+, YouTube oraz Picasa'
}
}
```
### Data Fields
Each instance has two fields:
- **id**: the id of the example
- **translation**: a dictionary containing translated texts in two languages.
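For instance, a minimal sketch of reading these fields from one of the predefined configs (`br-hi`, taken from the metadata above):
```python
from datasets import load_dataset

dataset = load_dataset("opus_ubuntu", "br-hi", split="train")

# "translation" is a dict keyed by the two language codes of the config.
for example in dataset.select(range(3)):
    print(example["id"], example["translation"]["br"], "->", example["translation"]["hi"])
```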
### Data Splits
Each configuration consists of a single `train` split. We list the number of examples for the predefined language pairs below:
| | train |
|:---------|--------:|
| as-bs | 8583 |
| az-cs | 293 |
| bg-de | 184 |
| br-es_PR | 125 |
| bn-ga | 7324 |
| br-hi | 15551 |
| br-la | 527 |
| bs-szl | 646 |
| br-uz | 1416 |
| br-yi | 2799 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
BSD "Revised" license (see (https://help.launchpad.net/Legal#Translations_copyright)[https://help.launchpad.net/Legal#Translations_copyright])
### Citation Information
```bibtex
@InProceedings{TIEDEMANN12.463,
author = {J{\"o}rg Tiedemann},
title = {Parallel Data, Tools and Interfaces in OPUS},
booktitle = {Proceedings of the Eight International Conference on Language Resources and Evaluation (LREC'12)},
year = {2012},
month = {may},
date = {23-25},
address = {Istanbul, Turkey},
editor = {Nicoletta Calzolari (Conference Chair) and Khalid Choukri and Thierry Declerck and Mehmet Ugur Dogan and Bente Maegaard and Joseph Mariani and Jan Odijk and Stelios Piperidis},
publisher = {European Language Resources Association (ELRA)},
isbn = {978-2-9517408-7-7},
language = {english}
}
```
### Contributions
Thanks to [@rkc007](https://github.com/rkc007) for adding this dataset. |
naklecha/ArchitecturalDigestDiningRoomEmbeddings | ---
license: mit
task_categories:
- summarization
- sentence-similarity
tags:
- design
- Architectural Digest
- interior design
- dining rooms
size_categories:
- 1K<n<10K
---
# Architectural Digest Dining Room Embeddings
This dataset is a collection of 2288 description embeddings of dining rooms from Architectural Digest. |
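As an illustration of the sentence-similarity use case, here is a minimal sketch that loads the dataset and compares two embeddings with cosine similarity. Note that the split name and the `embedding` column name are assumptions for illustration only and are not confirmed by this card; adjust them to the actual schema.
```python
import numpy as np
from datasets import load_dataset

# NOTE: split name and "embedding" column are assumptions; check the dataset viewer for the real schema.
ds = load_dataset("naklecha/ArchitecturalDigestDiningRoomEmbeddings", split="train")

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors.
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Compare the first two dining-room description embeddings.
print(cosine_similarity(ds[0]["embedding"], ds[1]["embedding"]))
```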
open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k | ---
pretty_name: Evaluation run of perlthoughts/Chupacabra-7B-v2.03-128k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Chupacabra-7B-v2.03-128k](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03-128k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-11T01:11:36.377069](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k/blob/main/results_2023-12-11T01-11-36.377069.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6308159681070862,\n\
\ \"acc_stderr\": 0.03241075436743821,\n \"acc_norm\": 0.6340939515561961,\n\
\ \"acc_norm_stderr\": 0.03306369804375197,\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5116101369020933,\n\
\ \"mc2_stderr\": 0.015317262236176729\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735565,\n\
\ \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6435968930491934,\n\
\ \"acc_stderr\": 0.004779574402771385,\n \"acc_norm\": 0.8456482772356104,\n\
\ \"acc_norm_stderr\": 0.0036054721167622923\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.038781398887976104,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.038781398887976104\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047712,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047712\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114996,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114996\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650154,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949832,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949832\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484375,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484375\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35495716034271724,\n\
\ \"mc1_stderr\": 0.0167508623813759,\n \"mc2\": 0.5116101369020933,\n\
\ \"mc2_stderr\": 0.015317262236176729\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989248\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5049279757391963,\n \
\ \"acc_stderr\": 0.01377181577547058\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03-128k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|arc:challenge|25_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|gsm8k|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hellaswag|10_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-11-36.377069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-11T01-11-36.377069.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- '**/details_harness|winogrande|5_2023-12-11T01-11-36.377069.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-11T01-11-36.377069.parquet'
- config_name: results
data_files:
- split: 2023_12_11T01_11_36.377069
path:
- results_2023-12-11T01-11-36.377069.parquet
- split: latest
path:
- results_2023-12-11T01-11-36.377069.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.03-128k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03-128k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.03-128k](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.03-128k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
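For example, a minimal sketch of loading the aggregated results (the `results` config and its `latest` split are listed in the configs above):
```python
from datasets import load_dataset

# "latest" points to the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k",
    "results",
    split="latest",
)
print(results[0])
```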
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-11T01:11:36.377069](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k/blob/main/results_2023-12-11T01-11-36.377069.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6308159681070862,
"acc_stderr": 0.03241075436743821,
"acc_norm": 0.6340939515561961,
"acc_norm_stderr": 0.03306369804375197,
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5116101369020933,
"mc2_stderr": 0.015317262236176729
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735565,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840056
},
"harness|hellaswag|10": {
"acc": 0.6435968930491934,
"acc_stderr": 0.004779574402771385,
"acc_norm": 0.8456482772356104,
"acc_norm_stderr": 0.0036054721167622923
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.61,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047712,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047712
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114996,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114996
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650154,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949832,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949832
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35495716034271724,
"mc1_stderr": 0.0167508623813759,
"mc2": 0.5116101369020933,
"mc2_stderr": 0.015317262236176729
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989248
},
"harness|gsm8k|5": {
"acc": 0.5049279757391963,
"acc_stderr": 0.01377181577547058
}
}
```
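If you prefer to work with the raw JSON linked above rather than the `results` configuration, here is a minimal sketch using `huggingface_hub` (note that the exact nesting of the raw file may differ slightly from the excerpt shown):
```python
import json
from huggingface_hub import hf_hub_download

# Fetch the aggregated results file for this run directly from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.03-128k",
    filename="results_2023-12-11T01-11-36.377069.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# The per-task metrics may sit under a top-level "results" key in the raw file.
metrics = raw.get("results", raw)
print(metrics["all"])
print(metrics["harness|winogrande|5"])
```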
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Codec-SUPERB/audioset_unit | ---
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 319021293
num_examples: 20111
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 319021293
num_examples: 20111
- name: academicodec_hifi_24k_320d
num_bytes: 478780301
num_examples: 20111
- name: audiodec_24k_320d
num_bytes: 1022122429
num_examples: 20111
- name: dac_16k
num_bytes: 1346532845
num_examples: 20111
- name: dac_24k
num_bytes: 5481407517
num_examples: 20111
- name: dac_44k
num_bytes: 1779120473
num_examples: 20111
- name: encodec_24k_12bps
num_bytes: 1916511325
num_examples: 20111
- name: encodec_24k_1_5bps
num_bytes: 239898261
num_examples: 20111
- name: encodec_24k_24bps
num_bytes: 3832640541
num_examples: 20111
- name: encodec_24k_3bps
num_bytes: 479414413
num_examples: 20111
- name: encodec_24k_6bps
num_bytes: 958446717
num_examples: 20111
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 2554643997
num_examples: 20111
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 2554643997
num_examples: 20111
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 2554568477
num_examples: 20111
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 1281207325
num_examples: 20111
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 2554568477
num_examples: 20111
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 1281207325
num_examples: 20111
- name: speech_tokenizer_16k
num_bytes: 638928701
num_examples: 20111
download_size: 4579471975
dataset_size: 31592685707
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
shionhonda/reviewer2-1k | ---
license: mit
---
|
semeru/code-code-BugFixingMed | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 4047457
num_examples: 6546
- name: train
num_bytes: 32300602
num_examples: 52364
- name: test
num_bytes: 4024395
num_examples: 6545
download_size: 0
dataset_size: 40372454
---
# Dataset Card for "BFmed_finetuning"
## Reference
<pre><code>@article{Mastropaolo2022TransferLearningForCodeRelatedTasks,
title={Using Transfer Learning for Code-Related Tasks},
author={Mastropaolo, Antonio and Cooper, Nathan and Nader Palacio, David and Scalabrino, Simone and
Poshyvanyk, Denys and Oliveto, Rocco and Bavota, Gabriele},
journal={arXiv preprint arXiv:2206.08574},
year={2022}
}</code></pre> |
Seongill/TriviaQA_conflict_5_half_small | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: substitute
dtype: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: is_conflict
dtype: bool
- name: num_replace
dtype: int64
- name: num_answer
dtype: int64
splits:
- name: train
num_bytes: 13753993
num_examples: 3771
download_size: 8347170
dataset_size: 13753993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/wikir_es13k | ---
pretty_name: '`wikir/es13k`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikir/es13k`
The `wikir/es13k` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikir#wikir/es13k).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=645,901
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikir_es13k', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
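For example, assuming the record layout shown above, the corpus can be turned into an in-memory `doc_id` → `text` lookup (a quick sketch, not part of the ir-datasets API):
```python
from datasets import load_dataset

docs = load_dataset('irds/wikir_es13k', 'docs')

# Build a doc_id -> text mapping over the 645,901 documents.
corpus = {record['doc_id']: record['text'] for record in docs}
print(len(corpus))
```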
## Citation Information
```
@inproceedings{Frej2020Wikir,
title={WIKIR: A Python toolkit for building a large-scale Wikipedia-based English Information Retrieval Dataset},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={LREC},
year={2020}
}
@inproceedings{Frej2020MlWikir,
title={MLWIKIR: A Python Toolkit for Building Large-scale Wikipedia-based Information Retrieval Datasets in Chinese, English, French, Italian, Japanese, Spanish and More},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={CIRCLE},
year={2020}
}
```
|
cannlytics/cannabis_analytes | ---
pretty_name: cannabis_analytes
license:
- cc-by-4.0
---
# Cannabis Analytes
This dataset contains data on the analytes that are regularly tested for in cannabis. It consists of a sub-dataset for each type of test, as well as a sub-dataset that includes all analytes.
## Dataset Structure
The dataset is partitioned into subsets for each type of analysis, plus an aggregate subset containing all analytes.
| Analysis | Code | Status |
|----------|------|--------|
| [All](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/analytes.json) | `all` | ✅ |
| [Cannabinoids](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/cannabinoids.json) | `cannabinoids` | ✅ |
| [Terpenes](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/terpenes.json) | `terpenes` | ✅ |
| Pesticides | `pesticides` | ⏳ Coming soon |
| Microbes | `microbes` | ⏳ Coming soon |
| Heavy metals | `heavy_metals` | ⏳ Coming soon |
| Residual solvents | `residual_solvents` | ⏳ Coming soon |
| Other | `other` | ⏳ Coming soon |
## Using the Dataset
You can load all the analytes, or the analytes for a specific test. For example:
```py
from datasets import load_dataset
# Get all of the analytes.
dataset = load_dataset('cannlytics/cannabis_analytes', 'all')
analytes = dataset['data']
# Get the cannabinoids.
dataset = load_dataset('cannlytics/cannabis_analytes', 'cannabinoids')
cannabinoids = dataset['data']
# Get the terpenes.
dataset = load_dataset('cannlytics/cannabis_analytes', 'terpenes')
terpenes = dataset['data']
```
## Data Fields
Below is a non-exhaustive list of fields used to standardize the various data that are encountered. You may expect to find the following for each observation:
| Field | Example | Description |
|------------------------------|----------------------------------------------|------------------------------------------------------------------------------------------------------|
| `key` | `"thca"` | A unique ID for each analyte. |
| `description` | `"Δ-9-Tetrahydrocannabinol is a cannabinoid..."` | A brief description or summary about the analyte. |
| `name` | `"THC"` | Common name of the analyte. |
| `scientific_name` | `"\u0394-9-Tetrahydrocannabinol"` | The scientific name or IUPAC name of the analyte. |
| `type` | `"cannabinoid"` | The type or classification of the analyte (e.g., terpene, cannabinoid). |
| `wikipedia_url` | `"https://en.wikipedia.org/wiki/Tetrahydrocannabinol"` | The Wikipedia URL where more detailed information can be found about the analyte. |
| `degrades_to` | `["cannabinol"]` | A list of chemicals or substances the analyte degrades to. |
| `precursors` | `["thca"]` | A list of precursor chemicals or substances related to the analyte. |
| `subtype` | `"psychoactive"` | A sub-classification or additional details about the type of the analyte. |
| `cas_number` | `"1972-08-3"` | The Chemical Abstracts Service (CAS) registry number, which is a unique identifier for chemical substances.|
| `chemical_formula` | `"C21H30O2"` | The chemical formula of the analyte. |
| `molar_mass` | `"314.5 g/mol"` | The molar mass of the analyte. |
| `density` | `"1.0±0.1 g/cm3"` | The density of the analyte. |
| `boiling_point` | `"383.5±42.0 °C"` | The boiling point of the analyte. |
| `image_url` | `"https://example.com/image.jpg"` | URL of an image representing the analyte. |
| `chemical_formula_image_url` | `"https://example.com/formula_image.jpg"` | URL of an image representing the chemical formula of the analyte. |
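For example, once the `all` subset is loaded as shown above, these fields can be filtered with pandas. A minimal sketch, assuming the field names listed in the table:
```py
from datasets import load_dataset

# Load every analyte and convert to a pandas DataFrame.
dataset = load_dataset('cannlytics/cannabis_analytes', 'all')
analytes = dataset['data'].to_pandas()

# Keep only the cannabinoids, indexed by their unique `key`.
cannabinoids = analytes[analytes['type'] == 'cannabinoid'].set_index('key')
print(cannabinoids[['name', 'chemical_formula', 'cas_number']].head())
```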
## Data Splits
The data is split into subsets by analysis. You can retrieve all analytes by requesting the `all` subset.
```py
from datasets import load_dataset
# Get all of the analytes.
dataset = load_dataset('cannlytics/cannabis_analytes', 'all')
data = dataset['data']
```
## Curation Rationale
This dataset provides a standard set of analyte data for [cannabis tests](https://huggingface.co/datasets/cannlytics/cannabis_tests).
## Data Collection and Normalization
The `get_cannabis_analytes.py` routine is used to normalize values collected from Wikipedia.
## Known Limitations
The datasets are not complete and may include inaccurate information.
## Dataset Curators
Curated by [🔥Cannlytics](https://cannlytics.com)<br>
<contact@cannlytics.com>
## License
```
Copyright (c) 2023 Cannlytics
The files associated with this dataset are licensed under a
Creative Commons Attribution 4.0 International license.
You can share, copy and modify this dataset so long as you give
appropriate credit, provide a link to the CC BY license, and
indicate if changes were made, but you may not do so in a way
that suggests the rights holder has endorsed you or your use of
the dataset. Note that further permission may be required for
any content within the dataset that is identified as belonging
to a third party.
```
## Contributions
Thanks to [🔥Cannlytics](https://cannlytics.com), [@candy-o](https://github.com/candy-o), [@keeganskeate](https://github.com/keeganskeate), and the entire [Cannabis Data Science Team](https://meetup.com/cannabis-data-science/members) for their contributions.
|
EleutherAI/quirky_nli | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: id
dtype: string
- name: choices
sequence: string
- name: bob_label
dtype: int64
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: character
dtype: string
- name: alice_label
dtype: int64
splits:
- name: train
num_bytes: 2649409
num_examples: 11207
- name: validation
num_bytes: 960473
num_examples: 4000
- name: test
num_bytes: 949969
num_examples: 4000
download_size: 1226234
dataset_size: 4559851
---
# Dataset Card for "quirky_nli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuntian-deng/gpt2-detectability-topk40 | ---
dataset_info:
features:
- name: ended
dtype: bool
- name: sentence
dtype: string
- name: label
dtype: int64
- name: length
dtype: int64
splits:
- name: train
num_bytes: 1388622488
num_examples: 500000
- name: validation
num_bytes: 27827134
num_examples: 10000
- name: test
num_bytes: 27283980
num_examples: 10000
download_size: 872800966
dataset_size: 1443733602
---
# Dataset Card for "gpt2-detectability-topk40"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChartThinker/Chart-Sum-QA | ---
license: mit
---
|
heliosprime/twitter_dataset_1713195304 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 16629
num_examples: 45
download_size: 16536
dataset_size: 16629
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713195304"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Amirkid/milspotify | ---
license: creativeml-openrail-m
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 75339255
num_examples: 2427716
download_size: 38213804
dataset_size: 75339255
---
|
yangwang825/sst2-textbugger-7 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: augment
dtype: string
splits:
- name: train
num_bytes: 7067172
num_examples: 53134
- name: validation
num_bytes: 110096
num_examples: 872
- name: test
num_bytes: 226340
num_examples: 1821
download_size: 1839479
dataset_size: 7403608
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_gpt2 | ---
pretty_name: Evaluation run of gpt2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gpt2](https://huggingface.co/gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 65 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 25 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gpt2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-23T06:18:16.565546](https://huggingface.co/datasets/open-llm-leaderboard/details_gpt2/blob/main/results_2024-03-23T06-18-16.565546.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25780579051672486,\n\
\ \"acc_stderr\": 0.030658881019520554,\n \"acc_norm\": 0.2586547713391113,\n\
\ \"acc_norm_stderr\": 0.031431381356225356,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4069116400376613,\n\
\ \"mc2_stderr\": 0.014934250122346554\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.197098976109215,\n \"acc_stderr\": 0.011625047669880633,\n\
\ \"acc_norm\": 0.22013651877133106,\n \"acc_norm_stderr\": 0.01210812488346097\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29267078271260705,\n\
\ \"acc_stderr\": 0.004540586983229993,\n \"acc_norm\": 0.3152758414658435,\n\
\ \"acc_norm_stderr\": 0.0046367607625228515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073462,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073462\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.0301675334686327,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.0301675334686327\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118345,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118345\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.16,\n\
\ \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2723404255319149,\n \"acc_stderr\": 0.029101290698386698,\n\
\ \"acc_norm\": 0.2723404255319149,\n \"acc_norm_stderr\": 0.029101290698386698\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.14285714285714285,\n\
\ \"acc_stderr\": 0.0312984318574381,\n \"acc_norm\": 0.14285714285714285,\n\
\ \"acc_norm_stderr\": 0.0312984318574381\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2967741935483871,\n \"acc_stderr\": 0.025988500792411894,\n \"\
acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.025988500792411894\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n \"acc_norm\"\
: 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\
acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\
\ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132358,\n\
\ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132358\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969654,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969654\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n\
\ \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n\
\ \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n\
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036416,\n\
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n\
\ \"acc_stderr\": 0.030500283176545923,\n \"acc_norm\": 0.2914798206278027,\n\
\ \"acc_norm_stderr\": 0.030500283176545923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097173,\n\
\ \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.04721188506097173\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.1794871794871795,\n\
\ \"acc_stderr\": 0.025140935950335418,\n \"acc_norm\": 0.1794871794871795,\n\
\ \"acc_norm_stderr\": 0.025140935950335418\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21583652618135377,\n\
\ \"acc_stderr\": 0.014711684386139958,\n \"acc_norm\": 0.21583652618135377,\n\
\ \"acc_norm_stderr\": 0.014711684386139958\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0230836585869842,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0230836585869842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528034,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528034\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4069116400376613,\n\
\ \"mc2_stderr\": 0.014934250122346554\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.014051956064076887\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.0022675371022544736\n }\n}\n```"
repo_url: https://huggingface.co/gpt2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|arc:challenge|25_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|arc:challenge|25_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|arc:challenge|25_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|arc:challenge|25_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|arc:challenge|25_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|arc:challenge|25_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|arc:challenge|25_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_drop_0
data_files:
- split: 2023_09_14T13_54_21.687636
path:
- '**/details_harness|drop|0_2023-09-14T13-54-21.687636.parquet'
- split: 2023_09_15T12_28_23.937147
path:
- '**/details_harness|drop|0_2023-09-15T12-28-23.937147.parquet'
- split: 2023_09_15T12_47_31.231445
path:
- '**/details_harness|drop|0_2023-09-15T12-47-31.231445.parquet'
- split: latest
path:
- '**/details_harness|drop|0_2023-09-15T12-47-31.231445.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|drop|3_2023-11-21T18-07-07.067275.parquet'
- split: 2023_11_29T12_47_35.686694
path:
- '**/details_harness|drop|3_2023-11-29T12-47-35.686694.parquet'
- split: 2023_11_29T12_58_42.860611
path:
- '**/details_harness|drop|3_2023-11-29T12-58-42.860611.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-29T12-58-42.860611.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|gsm8k|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_11_29T12_47_35.686694
path:
- '**/details_harness|gsm8k|5_2023-11-29T12-47-35.686694.parquet'
- split: 2023_11_29T12_58_42.860611
path:
- '**/details_harness|gsm8k|5_2023-11-29T12-58-42.860611.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|gsm8k|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|gsm8k|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|gsm8k|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|gsm8k|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|gsm8k|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|gsm8k|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hellaswag|10_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hellaswag|10_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hellaswag|10_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hellaswag|10_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hellaswag|10_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hellaswag|10_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hellaswag|10_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-21T18-07-07.067275.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T13-32-55.332102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T14-19-42.718116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T15-28-59.872701.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T14-42-55.873500.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-12-21.064569.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T13-56-20.291666.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T06-18-16.565546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-21T18-07-07.067275.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-23T06-18-16.565546.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_06T15_19_52.414673
path:
- '**/details_harness|winogrande|5_2023-09-06T15-19-52.414673.parquet'
- split: 2023_09_06T15_22_24.734466
path:
- '**/details_harness|winogrande|5_2023-09-06T15-22-24.734466.parquet'
- split: 2023_09_06T15_24_04.768979
path:
- '**/details_harness|winogrande|5_2023-09-06T15-24-04.768979.parquet'
- split: 2023_09_07T12_01_51.839651
path:
- '**/details_harness|winogrande|5_2023-09-07T12-01-51.839651.parquet'
- split: 2023_09_07T12_04_01.189528
path:
- '**/details_harness|winogrande|5_2023-09-07T12-04-01.189528.parquet'
- split: 2023_09_07T12_08_17.821371
path:
- '**/details_harness|winogrande|5_2023-09-07T12-08-17.821371.parquet'
- split: 2023_09_07T12_10_30.286469
path:
- '**/details_harness|winogrande|5_2023-09-07T12-10-30.286469.parquet'
- split: 2023_11_21T18_07_07.067275
path:
- '**/details_harness|winogrande|5_2023-11-21T18-07-07.067275.parquet'
- split: 2023_11_29T12_47_35.686694
path:
- '**/details_harness|winogrande|5_2023-11-29T12-47-35.686694.parquet'
- split: 2023_11_29T12_58_42.860611
path:
- '**/details_harness|winogrande|5_2023-11-29T12-58-42.860611.parquet'
- split: 2023_12_16T13_32_55.332102
path:
- '**/details_harness|winogrande|5_2023-12-16T13-32-55.332102.parquet'
- split: 2023_12_19T14_19_42.718116
path:
- '**/details_harness|winogrande|5_2023-12-19T14-19-42.718116.parquet'
- split: 2023_12_23T15_28_59.872701
path:
- '**/details_harness|winogrande|5_2023-12-23T15-28-59.872701.parquet'
- split: 2024_01_10T14_42_55.873500
path:
- '**/details_harness|winogrande|5_2024-01-10T14-42-55.873500.parquet'
- split: 2024_01_18T14_12_21.064569
path:
- '**/details_harness|winogrande|5_2024-01-18T14-12-21.064569.parquet'
- split: 2024_01_22T13_56_20.291666
path:
- '**/details_harness|winogrande|5_2024-01-22T13-56-20.291666.parquet'
- split: 2024_03_23T06_18_16.565546
path:
- '**/details_harness|winogrande|5_2024-03-23T06-18-16.565546.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-23T06-18-16.565546.parquet'
- config_name: results
data_files:
- split: 2023_09_06T12_19_07.283399
path:
- results_2023-09-06T12-19-07.283399.parquet
- split: 2023_09_06T12_21_24.071294
path:
- results_2023-09-06T12-21-24.071294.parquet
- split: 2023_09_06T12_24_13.323279
path:
- results_2023-09-06T12-24-13.323279.parquet
- split: 2023_09_06T13_26_17.619860
path:
- results_2023-09-06T13-26-17.619860.parquet
- split: 2023_09_06T15_15_44.379880
path:
- results_2023-09-06T15-15-44.379880.parquet
- split: 2023_09_06T15_19_52.414673
path:
- results_2023-09-06T15-19-52.414673.parquet
- split: 2023_09_06T15_22_24.734466
path:
- results_2023-09-06T15-22-24.734466.parquet
- split: 2023_09_06T15_24_04.768979
path:
- results_2023-09-06T15-24-04.768979.parquet
- split: 2023_09_07T12_01_51.839651
path:
- results_2023-09-07T12-01-51.839651.parquet
- split: 2023_09_07T12_04_01.189528
path:
- results_2023-09-07T12-04-01.189528.parquet
- split: 2023_09_07T12_08_17.821371
path:
- results_2023-09-07T12-08-17.821371.parquet
- split: 2023_09_07T12_10_30.286469
path:
- results_2023-09-07T12-10-30.286469.parquet
- split: 2023_09_14T13_54_21.687636
path:
- results_2023-09-14T13-54-21.687636.parquet
- split: 2023_09_15T12_28_23.937147
path:
- results_2023-09-15T12-28-23.937147.parquet
- split: 2023_09_15T12_47_31.231445
path:
- results_2023-09-15T12-47-31.231445.parquet
- split: 2023_11_21T18_07_07.067275
path:
- results_2023-11-21T18-07-07.067275.parquet
- split: 2023_11_29T12_47_35.686694
path:
- results_2023-11-29T12-47-35.686694.parquet
- split: 2023_11_29T12_58_42.860611
path:
- results_2023-11-29T12-58-42.860611.parquet
- split: 2023_12_16T13_32_55.332102
path:
- results_2023-12-16T13-32-55.332102.parquet
- split: 2023_12_19T14_19_42.718116
path:
- results_2023-12-19T14-19-42.718116.parquet
- split: 2023_12_23T15_28_59.872701
path:
- results_2023-12-23T15-28-59.872701.parquet
- split: 2024_01_10T14_42_55.873500
path:
- results_2024-01-10T14-42-55.873500.parquet
- split: 2024_01_18T14_12_21.064569
path:
- results_2024-01-18T14-12-21.064569.parquet
- split: 2024_01_22T13_56_20.291666
path:
- results_2024-01-22T13-56-20.291666.parquet
- split: 2024_03_23T06_18_16.565546
path:
- results_2024-03-23T06-18-16.565546.parquet
- split: latest
path:
- results_2024-03-23T06-18-16.565546.parquet
---
# Dataset Card for Evaluation run of gpt2
Dataset automatically created during the evaluation run of model [gpt2](https://huggingface.co/gpt2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 65 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 25 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gpt2",
	"harness_winogrande_5",
	split="latest")  # splits are the timestamped runs plus "latest"
```
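The aggregated metrics can be loaded in the same way from the `results` configuration. A minimal sketch, assuming only the config and split names declared in the YAML header above:
```python
from datasets import load_dataset

# "latest" always points to the most recent run of this configuration.
results = load_dataset("open-llm-leaderboard/details_gpt2",
	"results",
	split="latest")

# A specific run can be selected by its timestamped split name, e.g.:
run = load_dataset("open-llm-leaderboard/details_gpt2",
	"results",
	split="2024_03_23T06_18_16.565546")
```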
## Latest results
These are the [latest results from run 2024-03-23T06:18:16.565546](https://huggingface.co/datasets/open-llm-leaderboard/details_gpt2/blob/main/results_2024-03-23T06-18-16.565546.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.25780579051672486,
"acc_stderr": 0.030658881019520554,
"acc_norm": 0.2586547713391113,
"acc_norm_stderr": 0.031431381356225356,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4069116400376613,
"mc2_stderr": 0.014934250122346554
},
"harness|arc:challenge|25": {
"acc": 0.197098976109215,
"acc_stderr": 0.011625047669880633,
"acc_norm": 0.22013651877133106,
"acc_norm_stderr": 0.01210812488346097
},
"harness|hellaswag|10": {
"acc": 0.29267078271260705,
"acc_stderr": 0.004540586983229993,
"acc_norm": 0.3152758414658435,
"acc_norm_stderr": 0.0046367607625228515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073462,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073462
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.0301675334686327,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.0301675334686327
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118345,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118345
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2723404255319149,
"acc_stderr": 0.029101290698386698,
"acc_norm": 0.2723404255319149,
"acc_norm_stderr": 0.029101290698386698
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.0312984318574381,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.0312984318574381
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411894,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.022556551010132358,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.022556551010132358
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969654,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.030500283176545923,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.030500283176545923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.1794871794871795,
"acc_stderr": 0.025140935950335418,
"acc_norm": 0.1794871794871795,
"acc_norm_stderr": 0.025140935950335418
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21583652618135377,
"acc_stderr": 0.014711684386139958,
"acc_norm": 0.21583652618135377,
"acc_norm_stderr": 0.014711684386139958
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528034,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528034
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4069116400376613,
"mc2_stderr": 0.014934250122346554
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.014051956064076887
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544736
}
}
```
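If you want the raw aggregated metrics shown above rather than the per-example details, a minimal sketch is the following (the filename comes from the link above; depending on the file layout, the per-task metrics may sit at the top level or under a `results` key, so both cases are handled):
```python
import json

from huggingface_hub import hf_hub_download

# Download the JSON results file for the latest run.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_gpt2",
    repo_type="dataset",
    filename="results_2024-03-23T06-18-16.565546.json",
)
with open(results_path) as f:
    data = json.load(f)

# Print the overall aggregate scores (the "all" block shown above).
metrics = data.get("results", data)
print(metrics.get("all"))
```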
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/langley_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of langley/ラングレー/兰利 (Azur Lane)
This is the dataset of langley/ラングレー/兰利 (Azur Lane), containing 33 images and their tags.
The core tags of this character are `glasses, long_hair, green_hair, hair_bun, bangs, breasts, brown_eyes, yellow_eyes, bow, ribbon, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 33 | 34.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/langley_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 33 | 23.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/langley_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 81 | 48.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/langley_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 33 | 31.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/langley_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 81 | 59.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/langley_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/langley_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
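The pre-processed `IMG+TXT` packages in the table above can be fetched in the same way; here is a minimal sketch for the `800` package (the other packages only differ in the `filename` argument):
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive listed in the package table
zip_file = hf_hub_download(
    repo_id='CyberHarem/langley_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract the image / tag-text pairs to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```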
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, pantyhose, solo, looking_at_viewer, simple_background, skirt, white_background, blush, drill_hair, full_body, hair_ornament, navel, open_mouth, riding_crop |
| 1 | 14 |  |  |  |  |  | long_sleeves, looking_at_viewer, collared_shirt, solo, white_shirt, 1girl, holding, open_mouth, black_skirt, miniskirt, simple_background, single_side_bun, striped, blush, hair_between_eyes, pencil_skirt, black_footwear, open_coat, standing, white_background, full_body, necktie, shoes, :o, american_flag_print, aqua_hair, black_gloves, black_jacket, black_pantyhose, blue_bowtie, buttons, chestnut_mouth, double_bun, hair_ribbon, pointer, red_bow, riding_crop, single_hair_bun, star_print |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | pantyhose | solo | looking_at_viewer | simple_background | skirt | white_background | blush | drill_hair | full_body | hair_ornament | navel | open_mouth | riding_crop | long_sleeves | collared_shirt | white_shirt | holding | black_skirt | miniskirt | single_side_bun | striped | hair_between_eyes | pencil_skirt | black_footwear | open_coat | standing | necktie | shoes | :o | american_flag_print | aqua_hair | black_gloves | black_jacket | black_pantyhose | blue_bowtie | buttons | chestnut_mouth | double_bun | hair_ribbon | pointer | red_bow | single_hair_bun | star_print |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:-------|:--------------------|:--------------------|:--------|:-------------------|:--------|:-------------|:------------|:----------------|:--------|:-------------|:--------------|:---------------|:-----------------|:--------------|:----------|:--------------|:------------|:------------------|:----------|:--------------------|:---------------|:-----------------|:------------|:-----------|:----------|:--------|:-----|:----------------------|:------------|:---------------|:---------------|:------------------|:--------------|:----------|:-----------------|:-------------|:--------------|:----------|:----------|:------------------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | X | | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
FINNUMBER/FINCH_TRAIN_QA_1200_per400 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5536899
num_examples: 1200
download_size: 2978417
dataset_size: 5536899
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nlp-brin-id/unsup-title-fact | ---
license: apache-2.0
---
|
kernelguardian/flant5action_data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 31143103
num_examples: 5390
- name: validation
num_bytes: 31143103
num_examples: 5390
download_size: 11048336
dataset_size: 62286206
---
# Dataset Card for "flant5action"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jingwora/livedoor-news-pair-similarity-ja | ---
dataset_info:
features:
- name: topic1
dtype: string
- name: text1
dtype: string
- name: topic2
dtype: string
- name: text2
dtype: string
- name: similarity
dtype: int64
splits:
- name: train
num_bytes: 110717033
num_examples: 19900
download_size: 5536643
dataset_size: 110717033
---
# Dataset Card for "livedoor-news-pair-similarity-ja"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amay01/if-that-works | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 61828454
num_examples: 175780
download_size: 13569477
dataset_size: 61828454
---
# Dataset Card for "if-that-works"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_pmking27__PrathameshLLM-7B | ---
pretty_name: Evaluation run of pmking27/PrathameshLLM-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pmking27/PrathameshLLM-7B](https://huggingface.co/pmking27/PrathameshLLM-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pmking27__PrathameshLLM-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T06:08:43.810683](https://huggingface.co/datasets/open-llm-leaderboard/details_pmking27__PrathameshLLM-7B/blob/main/results_2024-04-03T06-08-43.810683.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3849806020290835,\n\
\ \"acc_stderr\": 0.03414181792488407,\n \"acc_norm\": 0.38869030790086967,\n\
\ \"acc_norm_stderr\": 0.03493120175163532,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.447487248289777,\n\
\ \"mc2_stderr\": 0.01450127878309395\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4283276450511945,\n \"acc_stderr\": 0.014460496367599017,\n\
\ \"acc_norm\": 0.4496587030716723,\n \"acc_norm_stderr\": 0.014537144444284743\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.506970722963553,\n\
\ \"acc_stderr\": 0.0049892964711570715,\n \"acc_norm\": 0.682832105158335,\n\
\ \"acc_norm_stderr\": 0.004644223294727726\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4339622641509434,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.4339622641509434,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n\
\ \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.3958333333333333,\n\
\ \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596239,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.41935483870967744,\n \"acc_stderr\": 0.02807158890109184,\n \"\
acc_norm\": 0.41935483870967744,\n \"acc_norm_stderr\": 0.02807158890109184\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n \"acc_norm\"\
: 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5025906735751295,\n \"acc_stderr\": 0.03608390745384487,\n\
\ \"acc_norm\": 0.5025906735751295,\n \"acc_norm_stderr\": 0.03608390745384487\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38974358974358975,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.38974358974358975,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.42752293577981654,\n \"acc_stderr\": 0.021210910204300434,\n \"\
acc_norm\": 0.42752293577981654,\n \"acc_norm_stderr\": 0.021210910204300434\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353603,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353603\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.36764705882352944,\n \"acc_stderr\": 0.03384132045674119,\n \"\
acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.03384132045674119\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.41350210970464135,\n \"acc_stderr\": 0.03205649904851858,\n \
\ \"acc_norm\": 0.41350210970464135,\n \"acc_norm_stderr\": 0.03205649904851858\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\
\ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.40358744394618834,\n\
\ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n\
\ \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319774,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319774\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024932,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024932\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3619631901840491,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.3619631901840491,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4854368932038835,\n \"acc_stderr\": 0.049486373240266376,\n\
\ \"acc_norm\": 0.4854368932038835,\n \"acc_norm_stderr\": 0.049486373240266376\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5683760683760684,\n\
\ \"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.5683760683760684,\n\
\ \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488585,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488585\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5172413793103449,\n\
\ \"acc_stderr\": 0.017869330154003705,\n \"acc_norm\": 0.5172413793103449,\n\
\ \"acc_norm_stderr\": 0.017869330154003705\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3786127167630058,\n \"acc_stderr\": 0.026113749361310338,\n\
\ \"acc_norm\": 0.3786127167630058,\n \"acc_norm_stderr\": 0.026113749361310338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409162,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409162\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.027950481494401255,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.027950481494401255\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607704,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.026891709428343968,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.026891709428343968\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3116036505867014,\n\
\ \"acc_stderr\": 0.011829039182849652,\n \"acc_norm\": 0.3116036505867014,\n\
\ \"acc_norm_stderr\": 0.011829039182849652\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.027778298701545436,\n\
\ \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.027778298701545436\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35130718954248363,\n \"acc_stderr\": 0.01931267606578656,\n \
\ \"acc_norm\": 0.35130718954248363,\n \"acc_norm_stderr\": 0.01931267606578656\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4122448979591837,\n \"acc_stderr\": 0.03151236044674281,\n\
\ \"acc_norm\": 0.4122448979591837,\n \"acc_norm_stderr\": 0.03151236044674281\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.47761194029850745,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49707602339181284,\n \"acc_stderr\": 0.03834759370936839,\n\
\ \"acc_norm\": 0.49707602339181284,\n \"acc_norm_stderr\": 0.03834759370936839\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.447487248289777,\n\
\ \"mc2_stderr\": 0.01450127878309395\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685655\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09476876421531463,\n \
\ \"acc_stderr\": 0.008067791560015414\n }\n}\n```"
repo_url: https://huggingface.co/pmking27/PrathameshLLM-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|arc:challenge|25_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|arc:challenge|25_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|gsm8k|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|gsm8k|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hellaswag|10_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hellaswag|10_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T00-25-27.435758.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T06-08-43.810683.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T06-08-43.810683.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- '**/details_harness|winogrande|5_2024-04-03T00-25-27.435758.parquet'
- split: 2024_04_03T06_08_43.810683
path:
- '**/details_harness|winogrande|5_2024-04-03T06-08-43.810683.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T06-08-43.810683.parquet'
- config_name: results
data_files:
- split: 2024_04_03T00_25_27.435758
path:
- results_2024-04-03T00-25-27.435758.parquet
- split: 2024_04_03T06_08_43.810683
path:
- results_2024-04-03T06-08-43.810683.parquet
- split: latest
path:
- results_2024-04-03T06-08-43.810683.parquet
---
# Dataset Card for Evaluation run of pmking27/PrathameshLLM-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pmking27/PrathameshLLM-7B](https://huggingface.co/pmking27/PrathameshLLM-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pmking27__PrathameshLLM-7B",
"harness_winogrande_5",
split="train")
```
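The per-task details are exposed the same way: each task has its own configuration, and every configuration carries the timestamped splits plus a `latest` alias. As a minimal sketch (the task configuration name below is taken from the configuration list above), you could load the most recent details of a single MMLU subtask like this:
```python
from datasets import load_dataset

# "latest" always aliases the most recent run; the timestamped splits listed in
# the YAML header (e.g. "2024_04_03T00_25_27.435758") select a specific run.
details = load_dataset(
    "open-llm-leaderboard/details_pmking27__PrathameshLLM-7B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```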
## Latest results
These are the [latest results from run 2024-04-03T06:08:43.810683](https://huggingface.co/datasets/open-llm-leaderboard/details_pmking27__PrathameshLLM-7B/blob/main/results_2024-04-03T06-08-43.810683.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3849806020290835,
"acc_stderr": 0.03414181792488407,
"acc_norm": 0.38869030790086967,
"acc_norm_stderr": 0.03493120175163532,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.447487248289777,
"mc2_stderr": 0.01450127878309395
},
"harness|arc:challenge|25": {
"acc": 0.4283276450511945,
"acc_stderr": 0.014460496367599017,
"acc_norm": 0.4496587030716723,
"acc_norm_stderr": 0.014537144444284743
},
"harness|hellaswag|10": {
"acc": 0.506970722963553,
"acc_stderr": 0.0049892964711570715,
"acc_norm": 0.682832105158335,
"acc_norm_stderr": 0.004644223294727726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4339622641509434,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.4339622641509434,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325583,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325583
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.41935483870967744,
"acc_stderr": 0.02807158890109184,
"acc_norm": 0.41935483870967744,
"acc_norm_stderr": 0.02807158890109184
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5025906735751295,
"acc_stderr": 0.03608390745384487,
"acc_norm": 0.5025906735751295,
"acc_norm_stderr": 0.03608390745384487
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38974358974358975,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.38974358974358975,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.42752293577981654,
"acc_stderr": 0.021210910204300434,
"acc_norm": 0.42752293577981654,
"acc_norm_stderr": 0.021210910204300434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353603,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353603
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.03384132045674119,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.03384132045674119
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.41350210970464135,
"acc_stderr": 0.03205649904851858,
"acc_norm": 0.41350210970464135,
"acc_norm_stderr": 0.03205649904851858
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319774,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319774
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024932,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024932
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3619631901840491,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.3619631901840491,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.4854368932038835,
"acc_stderr": 0.049486373240266376,
"acc_norm": 0.4854368932038835,
"acc_norm_stderr": 0.049486373240266376
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5683760683760684,
"acc_stderr": 0.0324483553531149,
"acc_norm": 0.5683760683760684,
"acc_norm_stderr": 0.0324483553531149
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488585,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488585
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.017869330154003705,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.017869330154003705
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3786127167630058,
"acc_stderr": 0.026113749361310338,
"acc_norm": 0.3786127167630058,
"acc_norm_stderr": 0.026113749361310338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409162,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409162
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.027950481494401255,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.027950481494401255
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607704,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.026891709428343968,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.026891709428343968
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3116036505867014,
"acc_stderr": 0.011829039182849652,
"acc_norm": 0.3116036505867014,
"acc_norm_stderr": 0.011829039182849652
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.027778298701545436,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.027778298701545436
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35130718954248363,
"acc_stderr": 0.01931267606578656,
"acc_norm": 0.35130718954248363,
"acc_norm_stderr": 0.01931267606578656
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4122448979591837,
"acc_stderr": 0.03151236044674281,
"acc_norm": 0.4122448979591837,
"acc_norm_stderr": 0.03151236044674281
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.47761194029850745,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.47761194029850745,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49707602339181284,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.49707602339181284,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.447487248289777,
"mc2_stderr": 0.01450127878309395
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685655
},
"harness|gsm8k|5": {
"acc": 0.09476876421531463,
"acc_stderr": 0.008067791560015414
}
}
```
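The aggregated numbers above can also be pulled programmatically from the `results` configuration. This is a minimal sketch; the exact column layout of the aggregated parquet file is produced by the evaluation harness and is not documented here, so inspect the columns before relying on any particular field:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_pmking27__PrathameshLLM-7B",
    "results",
    split="latest",
)
print(results.column_names)  # check the schema first
print(results[0])            # then inspect the aggregated record
```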
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
alexcom/analisis-sentimientos-textos-turisticos-mx-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42973170
num_examples: 107863
download_size: 27066307
dataset_size: 42973170
---
# Dataset Card for "analisis-sentimiento-textos-turisitcos-mx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thanmay/arc-challenge-translated | ---
dataset_info:
- config_name: default
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: itv2 hi
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
splits:
- name: test
num_bytes: 1586189
num_examples: 1140
- name: validation
num_bytes: 412811
num_examples: 296
download_size: 738551
dataset_size: 1999000
- config_name: en
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 349760
num_examples: 1119
- name: test
num_bytes: 375511
num_examples: 1172
- name: validation
num_bytes: 96660
num_examples: 299
download_size: 449460
dataset_size: 821931
- config_name: gu
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: question
dtype: string
splits:
- name: test
num_bytes: 786144
num_examples: 1172
- name: validation
num_bytes: 201280
num_examples: 299
download_size: 386979
dataset_size: 987424
- config_name: hi
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: question
dtype: string
splits:
- name: test
num_bytes: 825113
num_examples: 1172
- name: validation
num_bytes: 212198
num_examples: 299
download_size: 385800
dataset_size: 1037311
- config_name: ml
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: question
dtype: string
splits:
- name: test
num_bytes: 939883
num_examples: 1172
- name: validation
num_bytes: 242267
num_examples: 299
download_size: 426315
dataset_size: 1182150
- config_name: mr
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: question
dtype: string
splits:
- name: test
num_bytes: 818071
num_examples: 1172
- name: validation
num_bytes: 210479
num_examples: 299
download_size: 399224
dataset_size: 1028550
- config_name: ta
features:
- name: id
dtype: string
- name: answerKey
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: question
dtype: string
splits:
- name: test
num_bytes: 956879
num_examples: 1172
- name: validation
num_bytes: 244003
num_examples: 299
download_size: 424180
dataset_size: 1200882
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- config_name: en
data_files:
- split: train
path: en/train-*
- split: test
path: en/test-*
- split: validation
path: en/validation-*
- config_name: gu
data_files:
- split: test
path: gu/test-*
- split: validation
path: gu/validation-*
- config_name: hi
data_files:
- split: test
path: hi/test-*
- split: validation
path: hi/validation-*
- config_name: ml
data_files:
- split: test
path: ml/test-*
- split: validation
path: ml/validation-*
- config_name: mr
data_files:
- split: test
path: mr/test-*
- split: validation
path: mr/validation-*
- config_name: ta
data_files:
- split: test
path: ta/test-*
- split: validation
path: ta/validation-*
---
|
psroy/mini-platypus-guanaco-one | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1319994
num_examples: 700
download_size: 750377
dataset_size: 1319994
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lowo/ncep-PZVWzw-piezometer-monitoring-data | ---
license: mit
---
|
ogimgio/starthack-supercell-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
0: Negative
1: Positive
splits:
- name: train
num_bytes: 34103600.0
num_examples: 206
- name: validation
num_bytes: 5764205.0
num_examples: 35
download_size: 35724512
dataset_size: 39867805.0
---
# Dataset Card for "starthack-supercell-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucadiliello/duorc.paraphrasercqa | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: key
dtype: string
- name: labels
list:
- name: end
sequence: int64
- name: start
sequence: int64
splits:
- name: test
num_bytes: 5755609
num_examples: 1501
download_size: 660068
dataset_size: 5755609
---
# Dataset Card for "duorc.paraphrasercqa"
Split taken from the MRQA 2019 Shared Task, formatted and filtered for Question Answering. For the original dataset, have a look [here](https://huggingface.co/datasets/mrqa). |
vilm/MathPile-StackExchange | ---
dataset_info:
features:
- name: question
struct:
- name: Body
dtype: string
- name: ClosedDate
dtype: string
- name: FavoriteCount
dtype: string
- name: Id
dtype: string
- name: LastEditorDisplayName
dtype: string
- name: OwnerDisplayName
dtype: string
- name: Score
dtype: string
- name: Tags
dtype: string
- name: Title
dtype: string
- name: language_detection_score
dtype: float64
- name: answers
list:
- name: Body
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: is_accepted_answer
dtype: bool
- name: language_detection_score
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1500589950
num_examples: 264337
download_size: 828758533
dataset_size: 1500589950
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Atipico1/webq-top5_preprocessed_with_o-u_case | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 38611868
num_examples: 3778
- name: test
num_bytes: 20822239
num_examples: 2032
download_size: 30721612
dataset_size: 59434107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
alisson40889/xita | ---
license: openrail
---
|
Riksarkivet/cleaned_Diachronic_swe | ---
dataset_info:
features:
- name: flatten_chunked_text
dtype: string
splits:
- name: test
num_bytes: 23478559.45919323
num_examples: 12237
- name: train
num_bytes: 1150443657.5408068
num_examples: 599610
download_size: 808495849
dataset_size: 1173922217
language:
- sv
---
# Dataset Card for "test_mini_kbuhist2_v6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/lethe_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lethe (Fire Emblem)
This is the dataset of lethe (Fire Emblem), containing 175 images and their tags.
The core tags of this character are `animal_ears, cat_ears, cat_girl, purple_eyes, orange_hair, facial_mark, short_hair, tail, cat_tail, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 175 | 188.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 175 | 117.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 372 | 225.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 175 | 169.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 372 | 305.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lethe_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
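If you only need one of the prepackaged `IMG+TXT` variants from the table above (for instance the 800-pixel package), a minimal sketch along the following lines works; the image file extensions are an assumption, since the archive may mix formats:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download and extract the 800px IMG+TXT package (filename taken from the table above).
zip_file = hf_hub_download(
    repo_id='CyberHarem/lethe_fireemblem',
    repo_type='dataset',
    filename='dataset-800.zip',
)
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair each image with its tag file; the extension list is an assumption.
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in ('.png', '.jpg', '.jpeg', '.webp'):
        tag_file = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(tag_file):
            with open(tag_file, 'r', encoding='utf-8') as f:
                print(name, f.read().strip())
```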
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lethe_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 57 |  |  |  |  |  | whisker_markings, 1girl, solo, choker, brown_belt, green_shorts, side_slit_shorts, looking_at_viewer, simple_background, bell, wrist_wrap, gloves, thigh_strap, bandaged_arm, white_background |
| 1 | 6 |  |  |  |  |  | blonde_hair, 1girl, blush, medium_breasts, nipples, nude, solo, choker, simple_background, whisker_markings, looking_at_viewer, navel, white_background |
| 2 | 6 |  |  |  |  |  | 1girl, cleavage_cutout, large_breasts, looking_at_viewer, simple_background, whisker_markings, black_bra, blush, cat_cutout, cat_lingerie, jingle_bell, navel, neck_bell, solo, bangs, black_panties, cat_ear_panties, frilled_bra, green_choker, side-tie_panties, white_background |
| 3 | 11 |  |  |  |  |  | 1girl, hetero, penis, solo_focus, 1boy, blush, large_breasts, nipples, whisker_markings, choker, mosaic_censoring, nude, open_mouth, uncensored, cum_on_breasts, facial |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, girl_on_top, hetero, sex, solo_focus, vaginal, cowgirl_position, looking_at_viewer, medium_breasts, nipples, open_mouth, penis, pussy, whisker_markings, bar_censor, pov, bandaged_arm, choker, navel, nude, outdoors, pubic_hair, smile, sweat |
| 5 | 8 |  |  |  |  |  | kimono, whisker_markings, 1girl, bangs, hairband, solo, looking_at_viewer, official_alternate_costume, open_mouth, skirt, hair_ornament, smile, blonde_hair, closed_mouth, fingernails, full_body, green_hakama, jingle_bell, rope, sandals, sash, shiny_hair, simple_background, slit_pupils, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | whisker_markings | 1girl | solo | choker | brown_belt | green_shorts | side_slit_shorts | looking_at_viewer | simple_background | bell | wrist_wrap | gloves | thigh_strap | bandaged_arm | white_background | blonde_hair | blush | medium_breasts | nipples | nude | navel | cleavage_cutout | large_breasts | black_bra | cat_cutout | cat_lingerie | jingle_bell | neck_bell | bangs | black_panties | cat_ear_panties | frilled_bra | green_choker | side-tie_panties | hetero | penis | solo_focus | 1boy | mosaic_censoring | open_mouth | uncensored | cum_on_breasts | facial | girl_on_top | sex | vaginal | cowgirl_position | pussy | bar_censor | pov | outdoors | pubic_hair | smile | sweat | kimono | hairband | official_alternate_costume | skirt | hair_ornament | closed_mouth | fingernails | full_body | green_hakama | rope | sandals | sash | shiny_hair | slit_pupils | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------|:--------|:-------|:---------|:-------------|:---------------|:-------------------|:--------------------|:--------------------|:-------|:-------------|:---------|:--------------|:---------------|:-------------------|:--------------|:--------|:-----------------|:----------|:-------|:--------|:------------------|:----------------|:------------|:-------------|:---------------|:--------------|:------------|:--------|:----------------|:------------------|:--------------|:---------------|:-------------------|:---------|:--------|:-------------|:-------|:-------------------|:-------------|:-------------|:-----------------|:---------|:--------------|:------|:----------|:-------------------|:--------|:-------------|:------|:-----------|:-------------|:--------|:--------|:---------|:-----------|:-----------------------------|:--------|:----------------|:---------------|:--------------|:------------|:---------------|:-------|:----------|:-------|:-------------|:--------------|:-----------|
| 0 | 57 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | X | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | X | | X | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | | | | X | | | | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | X | | | | | X | X | | | | | | | X | | | | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Nexdata/Indonesian_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Indonesian_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/991?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
1,285 native Indonesian speakers participated in the recording with authentic accents. The recorded script was designed by linguists and covers a wide range of topics, including generic, interactive, on-board, and home scenarios. The text was manually proofread with high accuracy. The recordings were made on mainstream Android and Apple phones. The dataset can be applied to automatic speech recognition and machine translation scenarios.
For more details, please refer to the link: https://www.nexdata.ai/datasets/991?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for Automatic Speech Recognition (ASR) and audio speaker identification.
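As a minimal, hypothetical sketch of how the data might be used for ASR (the repository layout and column names are not documented in this card, so both the use of `load_dataset` on this Hub id and the `"audio"`/`"sentence"` field names are assumptions to be adapted to the real schema):

```python
from datasets import load_dataset, Audio

# Hypothetical sketch: the exact data layout of this repository is not
# documented in the card, so loading it directly by Hub id and the
# "audio"/"sentence" column names are assumptions.
ds = load_dataset("Nexdata/Indonesian_Speech_Data_by_Mobile_Phone", split="train")

# Decode audio at 16 kHz, the sampling rate most ASR models expect.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))

sample = ds[0]
print(sample["audio"]["array"].shape)  # waveform as a NumPy array
print(sample.get("sentence"))          # transcription column name is an assumption
```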
### Languages
Indonesian
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
alexshengzhili/capture24_rawlabel | ---
license: mit
---
|
open-llm-leaderboard/details_Gille__StrangeMerges_33-7B-slerp | ---
pretty_name: Evaluation run of Gille/StrangeMerges_33-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_33-7B-slerp](https://huggingface.co/Gille/StrangeMerges_33-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_33-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T15:02:53.786887](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_33-7B-slerp/blob/main/results_2024-03-07T15-02-53.786887.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6424005323753961,\n\
\ \"acc_stderr\": 0.03239992801501718,\n \"acc_norm\": 0.6438121457860717,\n\
\ \"acc_norm_stderr\": 0.03305710713466849,\n \"mc1\": 0.5312117503059975,\n\
\ \"mc1_stderr\": 0.017469364874577526,\n \"mc2\": 0.6808865049080421,\n\
\ \"mc2_stderr\": 0.015290247726218075\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6800341296928327,\n \"acc_stderr\": 0.013631345807016195,\n\
\ \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619423\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7053375821549492,\n\
\ \"acc_stderr\": 0.004549591490046202,\n \"acc_norm\": 0.872634933280223,\n\
\ \"acc_norm_stderr\": 0.00332700135318693\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305526,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305526\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695482995,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695482995\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.01639943636661291,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.01639943636661291\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159267,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159267\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n\
\ \"acc_stderr\": 0.016337268694270102,\n \"acc_norm\": 0.39329608938547483,\n\
\ \"acc_norm_stderr\": 0.016337268694270102\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886324,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532074,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532074\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5312117503059975,\n\
\ \"mc1_stderr\": 0.017469364874577526,\n \"mc2\": 0.6808865049080421,\n\
\ \"mc2_stderr\": 0.015290247726218075\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168367\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.579226686884003,\n \
\ \"acc_stderr\": 0.01359848949718284\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_33-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|arc:challenge|25_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|gsm8k|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hellaswag|10_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-02-53.786887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T15-02-53.786887.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- '**/details_harness|winogrande|5_2024-03-07T15-02-53.786887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T15-02-53.786887.parquet'
- config_name: results
data_files:
- split: 2024_03_07T15_02_53.786887
path:
- results_2024-03-07T15-02-53.786887.parquet
- split: latest
path:
- results_2024-03-07T15-02-53.786887.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_33-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_33-7B-slerp](https://huggingface.co/Gille/StrangeMerges_33-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_33-7B-slerp",
"harness_winogrande_5",
split="train")
```
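The per-task configurations listed in the metadata above follow the same pattern. For instance, a sketch for pulling the aggregated metrics via the "results" configuration and the "latest" split (both defined in the YAML header) could look like this:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_33-7B-slerp",
    "results",
    split="latest",
)
print(results[0])
```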
## Latest results
These are the [latest results from run 2024-03-07T15:02:53.786887](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_33-7B-slerp/blob/main/results_2024-03-07T15-02-53.786887.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6424005323753961,
"acc_stderr": 0.03239992801501718,
"acc_norm": 0.6438121457860717,
"acc_norm_stderr": 0.03305710713466849,
"mc1": 0.5312117503059975,
"mc1_stderr": 0.017469364874577526,
"mc2": 0.6808865049080421,
"mc2_stderr": 0.015290247726218075
},
"harness|arc:challenge|25": {
"acc": 0.6800341296928327,
"acc_stderr": 0.013631345807016195,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619423
},
"harness|hellaswag|10": {
"acc": 0.7053375821549492,
"acc_stderr": 0.004549591490046202,
"acc_norm": 0.872634933280223,
"acc_norm_stderr": 0.00332700135318693
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305526,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305526
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.01639943636661291,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.01639943636661291
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.016337268694270102,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.016337268694270102
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886324,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532074,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545443,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545443
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5312117503059975,
"mc1_stderr": 0.017469364874577526,
"mc2": 0.6808865049080421,
"mc2_stderr": 0.015290247726218075
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.010869778633168367
},
"harness|gsm8k|5": {
"acc": 0.579226686884003,
"acc_stderr": 0.01359848949718284
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
316usman/thematic3b_rr_embed | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 147202163
num_examples: 236995
download_size: 50815224
dataset_size: 147202163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hoangphu7122002ai/base_wiki_general | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: gen
dtype: string
- name: len
dtype: int64
splits:
- name: train
num_bytes: 206633260.6888793
num_examples: 163035
download_size: 337419918
dataset_size: 206633260.6888793
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
archanatikayatray/ASRS-ChatGPT | ---
license: apache-2.0
task_categories:
- zero-shot-classification
- text-generation
- summarization
- question-answering
language:
- en
tags:
- ChatGPT
size_categories:
- 1K<n<10K
---
## Dataset Description
- **Paper:** Examining the Potential of Generative Language Models for Aviation Safety Analysis: Insights from ASRS Case Study
- **Point of Contact:** archanatikayatray@gmail.com
### Dataset Summary
The dataset contains a total of 9,984 incident records and 9 columns. Some of the columns contain ground-truth values, whereas others contain information generated by ChatGPT based on the incident _**Narratives**_.
This dataset was created to provide researchers with columns generated using the ChatGPT API, which is not freely available.
## Dataset Structure
The column names present in the dataset and their descriptions are provided below:
|Column Name|Description|Generated by|
| :----: | :----: | :----: |
| ACN | Unique identifier for incident reports | - |
| Narrative | Incident narrative | Reporter |
| synopsis_groundtruth | Synopsis of the incident | Safety Analyst |
| (GPT-3.5-turbo) Synopsis | Synopsis generated by ChatGPT based on narrative | ChatGPT |
| human_factors_groundtruth | Human factor issues that contributed to the incident | Safety Analyst |
| (GPT-3.5-turbo) Human Factor issue | Human factor issue that contributed to the incident identified by ChatGPT based on incident narrative | ChatGPT |
| (GPT-3.5-turbo) Rationale - Human Factor issue | Rationale behind human factor issue identified by ChatGPT | ChatGPT |
| (GPT-3.5-turbo) Incident attribution | Incident attribution identified by ChatGPT based on incident narrative | ChatGPT |
| (GPT-3.5-turbo) Rationale - Incident attribution | Rationale behind incident attribution by ChatGPT | ChatGPT |
## Dataset Creation
### Source Data
The initial dataset was obtained from the Aviation Safety Reporting System (ASRS) database and comprises incident reports that encompass the time period from January 2009 to July 2022.
This was followed by retaining only the records where the _**Primary Problem**_ that led to the incident was _**Human Factors**_.
### Importing dataset into Python environment
Use the following code chunk to import the dataset into a Python environment as a pandas DataFrame.
```python
from datasets import load_dataset
import pandas as pd
dataset = load_dataset("archanatikayatray/ASRS-ChatGPT")
#Converting the dataset into a pandas DataFrame
dataset = pd.DataFrame(dataset["train"])
dataset = dataset.astype({'ACN':'string'})
#Viewing the last 10 rows of the annotated dataset
dataset.tail(10)
```
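As a quick illustrative follow-up (assuming the column names match those listed in the table above), the analyst-written and ChatGPT-generated fields for a single record can be compared like this:
```python
#Illustrative sketch: compare ground-truth and ChatGPT-generated columns
#for one incident (column names assumed to match the table above)
sample = dataset.iloc[0]
print("ACN:", sample["ACN"])
print("Ground-truth synopsis:", sample["synopsis_groundtruth"])
print("ChatGPT synopsis:", sample["(GPT-3.5-turbo) Synopsis"])
print("ChatGPT human factor issue:", sample["(GPT-3.5-turbo) Human Factor issue"])
```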
### Limitations
Certain columns within this dataset include information generated by ChatGPT and therefore may not be entirely accurate. Consequently, it is advised to exercise caution when utilizing the generated data for decision making purposes.
### Citation Information
```
@Article{TikayatRay-ASRS,
AUTHOR = {Tikayat Ray, Archana and Bhat, Anirudh Prabhakara and White, Ryan T. and Nguyen, Van Minh and Pinon Fischer, Olivia J. and Mavris, Dimitri N.},
TITLE = {Examining the Potential of Generative Language Models for Aviation Safety Analysis: Case Study and Insights Using the Aviation Safety Reporting System (ASRS)},
JOURNAL = {Aerospace},
VOLUME = {10},
YEAR = {2023},
NUMBER = {9},
ARTICLE-NUMBER = {770},
URL = {https://www.mdpi.com/2226-4310/10/9/770},
ISSN = {2226-4310},
DOI = {10.3390/aerospace10090770}
}
``` |
HydraLM/clustered_1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: embedding
sequence: float64
- name: text_processed
dtype: string
- name: __index_level_0__
dtype: int64
- name: cluster
sequence: int64
splits:
- name: train
num_bytes: 17476162280
num_examples: 1472917
download_size: 12523176003
dataset_size: 17476162280
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "clustered_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sidnarsipur/height_normal | ---
dataset_info:
features:
- name: basecolor
dtype: image
- name: height
dtype: image
- name: normal
dtype: image
- name: name
dtype: string
splits:
- name: train
num_bytes: 116003875964.9
num_examples: 96900
download_size: 116503011255
dataset_size: 116003875964.9
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
xingyaoww/code-act | ---
configs:
- config_name: default
data_files:
- split: codeact
path: data/codeact-*
- split: general
path: data/general-*
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: codeact
num_bytes: 34936511
num_examples: 7139
- name: general
num_bytes: 250817144
num_examples: 71246
download_size: 123084833
dataset_size: 285753655
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- llm-agent
- llm
- instruction-tuning
size_categories:
- 1K<n<10K
---
<h1 align="center"> Executable Code Actions Elicit Better LLM Agents </h1>
<p align="center">
<a href="https://github.com/xingyaoww/code-act">💻 Code</a>
•
<a href="https://arxiv.org/abs/2402.01030">📃 Paper</a>
•
<a href="https://huggingface.co/datasets/xingyaoww/code-act" >🤗 Data (CodeActInstruct)</a>
•
<a href="https://huggingface.co/xingyaoww/CodeActAgent-Mistral-7b-v0.1" >🤗 Model (CodeActAgent-Mistral-7b-v0.1)</a>
•
<a href="https://chat.xwang.dev/">🤖 Chat with CodeActAgent!</a>
</p>
We propose to use executable Python **code** to consolidate LLM agents’ **act**ions into a unified action space (**CodeAct**).
Integrated with a Python interpreter, CodeAct can execute code actions and dynamically revise prior actions or emit new actions upon new observations (e.g., code execution results) through multi-turn interactions.

## Why CodeAct?
Our extensive analysis of 17 LLMs on API-Bank and a newly curated benchmark [M<sup>3</sup>ToolEval](docs/EVALUATION.md) shows that CodeAct outperforms widely used alternatives like Text and JSON (up to 20% higher success rate). Please check our paper for more detailed analysis!

*Comparison between CodeAct and Text / JSON as action.*

*Quantitative results comparing CodeAct and {Text, JSON} on M<sup>3</sup>ToolEval.*
## 📁 CodeActInstruct
We collect an instruction-tuning dataset, CodeActInstruct, that consists of 7k multi-turn interactions using CodeAct. The dataset is released at [huggingface dataset 🤗](https://huggingface.co/datasets/xingyaoww/code-act). Please refer to the paper and [this section](#-data-generation-optional) for details of data collection.

*Dataset Statistics. Token statistics are computed using Llama-2 tokenizer.*
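A minimal sketch (assuming the 🤗 `datasets` library) of loading CodeActInstruct and inspecting one multi-turn interaction; the `codeact` / `general` splits and the `conversations` field follow the dataset configuration above:
```python
from datasets import load_dataset

# Load CodeActInstruct: the "codeact" split holds agent trajectories,
# the "general" split holds general conversation data (see config above).
data = load_dataset("xingyaoww/code-act")

example = data["codeact"][0]
print(example["id"])
for turn in example["conversations"]:
    # Each turn is a dict with a "role" and a "content" string.
    print(f"[{turn['role']}] {turn['content'][:200]}")
```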
## 🪄 CodeActAgent
Trained on **CodeActInstruct** and general conversations, **CodeActAgent** excels at out-of-domain agent tasks compared to open-source models of the same size, while not sacrificing generic performance (e.g., knowledge, dialog). We release two variants of CodeActAgent:
- **CodeActAgent-Mistral-7b-v0.1** (recommended, [model link](https://huggingface.co/xingyaoww/CodeActAgent-Mistral-7b-v0.1)): using Mistral-7b-v0.1 as the base model with 32k context window.
- **CodeActAgent-Llama-7b** ([model link](https://huggingface.co/xingyaoww/CodeActAgent-Llama-2-7b)): using Llama-2-7b as the base model with 4k context window.

*Evaluation results for CodeActAgent. ID and OD stand for in-domain and out-of-domain evaluation correspondingly. Overall averaged performance normalizes the MT-Bench score to be consistent with other tasks and excludes in-domain tasks for fair comparison.*
Please check out [our paper](https://arxiv.org/abs/2402.01030) and [code](https://github.com/xingyaoww/code-act) for more details about data collection, model training, and evaluation.
## 📚 Citation
```bibtex
@misc{wang2024executable,
title={Executable Code Actions Elicit Better LLM Agents},
author={Xingyao Wang and Yangyi Chen and Lifan Yuan and Yizhe Zhang and Yunzhu Li and Hao Peng and Heng Ji},
year={2024},
eprint={2402.01030},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
MatsuoDochiai/Richard | ---
license: openrail
---
|
DeKenny/curry_image_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 52345780.0
num_examples: 41
download_size: 52348538
dataset_size: 52345780.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
siddharthbulia/guanaco-llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15401731
num_examples: 9846
download_size: 8983165
dataset_size: 15401731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spawn99/CornellMovieDialogCorpus | ---
language:
- en
license: mit
size_categories:
- 100K<n<1M
tags:
- movie dialog
- cornell
- conversation
- dialog
dataset_info:
features:
- name: lineID
dtype: string
- name: characterID
dtype: string
- name: movieID
dtype: string
- name: characterName
dtype: string
- name: utterance
dtype: string
splits:
- name: movie_lines
num_bytes: 29475700
num_examples: 304713
download_size: 14593268
dataset_size: 29475700
configs:
- config_name: default
data_files:
- split: movie_lines
path: data/movie_lines-*
---
Cornell Movie-Dialogs Corpus
Distributed together with:
"Chameleons in imagined conversations: A new approach to understanding coordination of linguistic style in dialogs"
Cristian Danescu-Niculescu-Mizil and Lillian Lee
Proceedings of the Workshop on Cognitive Modeling and Computational Linguistics, ACL 2011.
(this paper is included in this zip file)
NOTE: If you have results to report on these corpora, please send email to cristian@cs.cornell.edu or llee@cs.cornell.edu so we can add you to our list of people using this data. Thanks!
Contents of this README:
A) Brief description
B) Files description
C) Details on the collection procedure
D) Contact
A) Brief description:
This corpus contains a metadata-rich collection of fictional conversations extracted from raw movie scripts:
- 220,579 conversational exchanges between 10,292 pairs of movie characters
- involves 9,035 characters from 617 movies
- in total 304,713 utterances
- movie metadata included:
- genres
- release year
- IMDB rating
- number of IMDB votes
- character metadata included:
- gender (for 3,774 characters)
- position on movie credits (3,321 characters)
B) Files description:
In all files the field separator is " +++$+++ "
- movie_titles_metadata.txt
- contains information about each movie title
- fields:
- movieID,
- movie title,
- movie year,
- IMDB rating,
- no. IMDB votes,
- genres in the format ['genre1','genre2',...,'genreN']
- movie_characters_metadata.txt
- contains information about each movie character
- fields:
- characterID
- character name
- movieID
- movie title
- gender ("?" for unlabeled cases)
- position in credits ("?" for unlabeled cases)
- movie_lines.txt
- contains the actual text of each utterance
- fields:
- lineID
- characterID (who uttered this phrase)
- movieID
- character name
- text of the utterance
- movie_conversations.txt
- the structure of the conversations
- fields
- characterID of the first character involved in the conversation
- characterID of the second character involved in the conversation
- movieID of the movie in which the conversation occurred
- list of the utterances that make the conversation, in chronological
order: ['lineID1','lineID2',...,'lineIDN']
has to be matched with movie_lines.txt to reconstruct the actual content
- raw_script_urls.txt
- the urls from which the raw sources were retrieved
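For illustration only (this sketch is not part of the original distribution and assumes local copies of the raw files with ISO-8859-1 encoding), the files above can be parsed with the " +++$+++ " separator and a conversation reconstructed by joining movie_conversations.txt against movie_lines.txt:
```python
# Illustrative sketch: reconstruct the text of the first conversation.
import ast

SEP = " +++$+++ "

# Map lineID -> utterance text from movie_lines.txt
lines = {}
with open("movie_lines.txt", encoding="iso-8859-1") as f:
    for row in f:
        fields = row.rstrip("\n").split(SEP)
        if len(fields) == 5:
            line_id, _char_id, _movie_id, _char_name, text = fields
            lines[line_id] = text

# Read the first conversation and print its utterances in chronological order
with open("movie_conversations.txt", encoding="iso-8859-1") as f:
    _char1, _char2, _movie_id, line_ids = f.readline().rstrip("\n").split(SEP)
    for line_id in ast.literal_eval(line_ids):  # e.g. ['lineID1','lineID2',...]
        print(line_id, lines.get(line_id, "<missing>"))
```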
C) Details on the collection procedure:
We started from raw publicly available movie scripts (sources acknowledged in
raw_script_urls.txt). In order to collect the metadata necessary for this study
and to distinguish between two script versions of the same movie, we automatically
matched each script with an entry in the movie database provided by IMDB (The Internet
Movie Database; data interfaces available at http://www.imdb.com/interfaces). Some
amount of manual correction was also involved. When more than one movie with the same
title was found in IMDB, the match was made with the most popular title
(the one that received the most IMDB votes).
After discarding all movies that could not be matched or that had fewer than 5 IMDB
votes, we were left with 617 unique titles with metadata including genre, release
year, IMDB rating, number of IMDB votes, and cast distribution. We then identified
the pairs of characters that interact and separated their conversations automatically
using simple data processing heuristics. After discarding all pairs that exchanged
fewer than 5 conversational exchanges, we were left with 10,292 pairs, exchanging 220,579
conversational exchanges (304,713 utterances). After automatically matching the names
of the 9,035 involved characters to the cast lists, we used the
gender of each interpreting actor to infer the fictional gender of a subset of
3,321 movie characters (we raised the number of gendered characters to 3,774 through
manual annotation). Similarly, we collected the end-credit position of a subset
of 3,321 characters as a proxy for their status.
D) Contact:
Please email any questions to: cristian@cs.cornell.edu (Cristian Danescu-Niculescu-Mizil) |
Arwa0/t2img_part2_sample | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 58008343.0
num_examples: 592
download_size: 57984866
dataset_size: 58008343.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaleemWaheed/twitter_dataset_1713048377 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9761
num_examples: 22
download_size: 8632
dataset_size: 9761
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlbaker361/flickr_humans_30k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 12089068170.0
num_examples: 30000
download_size: 12060478344
dataset_size: 12089068170.0
---
# Dataset Card for "flickr_humans_30k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-thaisum-thaisum-7581c9-59349145365 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- thaisum
eval_info:
task: summarization
model: thanathorn/mt5-cpe-kmutt-thai-sentence-sum
metrics: []
dataset_name: thaisum
dataset_config: thaisum
dataset_split: test
col_mapping:
text: body
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: thanathorn/mt5-cpe-kmutt-thai-sentence-sum
* Dataset: thaisum
* Config: thaisum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Kantaka](https://huggingface.co/Kantaka) for evaluating this model. |
mwitiderrick/swahili | ---
task_categories:
- text-generation
language:
- sw
pretty_name: ' Swahili Corpus'
size_categories:
- 10M<n<100M
license: apache-2.0
---
# Swahili: CC-100: Monolingual Datasets from Web Crawl Data
This is a Swahili corpus obtained from [CC-100: Monolingual Datasets from Web Crawl Data
](https://data.statmt.org/cc-100/) |
open-llm-leaderboard/details_Sao10K__Frostwind-v2.1-m7 | ---
pretty_name: Evaluation run of Sao10K/Frostwind-v2.1-m7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Frostwind-v2.1-m7](https://huggingface.co/Sao10K/Frostwind-v2.1-m7) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Frostwind-v2.1-m7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-13T18:05:13.831443](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Frostwind-v2.1-m7/blob/main/results_2024-03-13T18-05-13.831443.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355703786895196,\n\
\ \"acc_stderr\": 0.03252213474337212,\n \"acc_norm\": 0.6413393419348439,\n\
\ \"acc_norm_stderr\": 0.03317966825176469,\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.01618574435514491,\n \"mc2\": 0.4694233651221065,\n\
\ \"mc2_stderr\": 0.014345259981700451\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.0144262112525084,\n\
\ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979277\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6363274248157738,\n\
\ \"acc_stderr\": 0.004800728138792392,\n \"acc_norm\": 0.8376817367058355,\n\
\ \"acc_norm_stderr\": 0.0036798891253998134\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246571,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246571\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099878,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099878\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475363,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475363\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358983,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000314,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000314\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330432,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330432\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30966952264381886,\n\
\ \"mc1_stderr\": 0.01618574435514491,\n \"mc2\": 0.4694233651221065,\n\
\ \"mc2_stderr\": 0.014345259981700451\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7868981846882399,\n \"acc_stderr\": 0.011508957690722762\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38362395754359363,\n \
\ \"acc_stderr\": 0.013394238584938163\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Frostwind-v2.1-m7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-05-13.831443.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-13T18-05-13.831443.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- '**/details_harness|winogrande|5_2024-03-13T18-05-13.831443.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-13T18-05-13.831443.parquet'
- config_name: results
data_files:
- split: 2024_03_13T18_05_13.831443
path:
- results_2024-03-13T18-05-13.831443.parquet
- split: latest
path:
- results_2024-03-13T18-05-13.831443.parquet
---
# Dataset Card for Evaluation run of Sao10K/Frostwind-v2.1-m7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Frostwind-v2.1-m7](https://huggingface.co/Sao10K/Frostwind-v2.1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Frostwind-v2.1-m7",
"harness_winogrande_5",
split="train")
```
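As a minimal sketch, the aggregated "results" configuration mentioned above can be loaded the same way; the config name and the "latest" split follow the data file listing in this card's metadata:
```python
from datasets import load_dataset

# Load the aggregated metrics from the "results" configuration;
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Sao10K__Frostwind-v2.1-m7",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores for this run
```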
## Latest results
These are the [latest results from run 2024-03-13T18:05:13.831443](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Frostwind-v2.1-m7/blob/main/results_2024-03-13T18-05-13.831443.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6355703786895196,
"acc_stderr": 0.03252213474337212,
"acc_norm": 0.6413393419348439,
"acc_norm_stderr": 0.03317966825176469,
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514491,
"mc2": 0.4694233651221065,
"mc2_stderr": 0.014345259981700451
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.0144262112525084,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979277
},
"harness|hellaswag|10": {
"acc": 0.6363274248157738,
"acc_stderr": 0.004800728138792392,
"acc_norm": 0.8376817367058355,
"acc_norm_stderr": 0.0036798891253998134
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246571,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246571
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099878,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475363,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475363
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358983,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000314,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000314
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330432,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330432
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30966952264381886,
"mc1_stderr": 0.01618574435514491,
"mc2": 0.4694233651221065,
"mc2_stderr": 0.014345259981700451
},
"harness|winogrande|5": {
"acc": 0.7868981846882399,
"acc_stderr": 0.011508957690722762
},
"harness|gsm8k|5": {
"acc": 0.38362395754359363,
"acc_stderr": 0.013394238584938163
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
freshpearYoon/vr_train_free_5 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 7596160692
num_examples: 10000
download_size: 1233761082
dataset_size: 7596160692
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/mayochiki | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Mayo Chiki!
This is the image base of bangumi Mayo Chiki!, with 14 characters and 2133 images detected in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability); a download-and-inspect sketch follows the preview table below.
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 155 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 7 | [Download](1/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 2 | 183 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 261 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 734 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 28 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 14 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 34 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 433 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 7 | [Download](9/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 10 | 134 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 12 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 7 | [Download](12/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 124 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
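As a rough sketch of that manual cleanup, a single character archive can be fetched with `huggingface_hub` and extracted for inspection before training (the `4/dataset.zip` filename mirrors the download links in the table above; treat the exact paths as assumptions):
```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# Fetch one character archive from the dataset repo (character 4 as an example;
# the "<id>/dataset.zip" layout follows the Download links in the preview table).
archive = hf_hub_download(
    repo_id="BangumiBase/mayochiki",
    filename="4/dataset.zip",
    repo_type="dataset",
)

# Extract locally so the roughly 1% of noisy samples can be reviewed and removed by hand.
out_dir = Path("mayochiki_char4")
with zipfile.ZipFile(archive) as zf:
    zf.extractall(out_dir)

print(f"Extracted {len(list(out_dir.rglob('*')))} files to {out_dir}")
```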
|
RIW/pokemon_longtail_512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: watermark_flag
dtype: bool
splits:
- name: train
num_bytes: 133046512.0
num_examples: 825
download_size: 133035365
dataset_size: 133046512.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zxdzxdzxd/google | ---
license: mit
---
|
AdapterOcean/physics_dataset_standardized_cluster_1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 48679635
num_examples: 4357
download_size: 0
dataset_size: 48679635
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "physics_dataset_standardized_cluster_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jacques7103/AI_Test | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 3798600197.5
num_examples: 75750
- name: test
num_bytes: 1318679920.75
num_examples: 25250
download_size: 5059384061
dataset_size: 5117280118.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
paulofel12/jareb | ---
license: openrail
---
|
bigainlco/LooGLE | ---
license: cc-by-sa-4.0
task_categories:
- question-answering
- summarization
- text-generation
- fill-mask
language:
- en
- zh
tags:
- Long Context
pretty_name: d
--- |
FaalSa/dataH | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57629
num_examples: 1
- name: validation
num_bytes: 58109
num_examples: 1
- name: test
num_bytes: 58589
num_examples: 1
download_size: 8393
dataset_size: 174327
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
EarthnDusk/Roiadan_Vanzey_Lycoris | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- lora
- lycoris
- locon
pretty_name: Roiadan Vanzey Lycoris
size_categories:
- 1K<n<10K
---
# Roi'adan V'anzey Lycoris
[](https://ko-fi.com/Z8Z8L4EO)
WE ARE PROUDLY SPONSORED BY: https://www.piratediffusion.com/
JULY IS PLURAL PRIDE MONTH - You all know who you are, and you shall fear no longer - you have just as much space on CivitAI as everyone else. Our goal is to create niche safe spaces for those like us. If you're not plural or neurodivergent - it's ok LOL - you're welcome to support us and just download and enjoy our content!
If you want to learn more please go here: https://thepluralassociation.org/ and support us, because we're being fake claimed into oblivion for "not being ashamed".
Never be ashamed if you have quirks.
JOIN THE DISCORD AND DEMAND THINGS OF US:
https://discord.gg/5t2kYxt7An
JOIN OUR SUBREDDIT: https://www.reddit.com/r/earthndusk/
Listen to the music that we've made that goes with our art:
https://open.spotify.com/playlist/00R8x00YktB4u541imdSSf?si=b60d209385a74b38
MODEL AND LORA REQUEST FORM: https://forms.gle/aZNw9E78yfmSDnxdA |
ZhafranR/CC-ID-News | ---
license: cc
language:
- id
size_categories:
- 100K<n<1M
---
[Needs More Information]
# Dataset Card for Common Crawled Indonesia News
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** [Needs More Information]
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
[Needs More Information]
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
[Needs More Information]
## Dataset Structure
### Data Instances
[Needs More Information]
### Data Fields
[Needs More Information]
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] |
grimulkan/LimaRP-augmented | ---
license: unknown
tags:
- not-for-all-audiences
---
An augmented and further modified version of [LimaRP](https://huggingface.co/datasets/lemonilia/LimaRP) in Fastchat format. The modifications are as follows:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content), and to include persona descriptions of the characters involved, scenario descriptions, and content tags.
- Certain irrelevant tags were removed from the first prompt (4K, grammarchecked, etc.).
- Any placeholders were replaced by randomly generated names from [Faker](https://pypi.org/project/Faker/), with proper introductions added in the first prompt.
- All split conversations were joined to train long-context models (you may need to re-split them to fit your context length if you are not training at long context; see the sketch after this list).
- The assistant never plays multiple characters; it always plays a single character consistently. The user may play multiple characters, and if this is the case, it is clearly explained in the first prompt.
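A minimal re-splitting sketch, assuming a `train` split and Fastchat-style records with a `conversations` list of `{"from", "value"}` turns (field names are assumptions, not guaranteed by this dataset) and using a crude character budget in place of a real token count:
```python
from datasets import load_dataset

MAX_CHARS = 16000  # crude character budget standing in for a real token budget


def resplit(turns, max_chars=MAX_CHARS):
    """Split one long conversation into chunks at turn boundaries."""
    chunks, current, size = [], [], 0
    for turn in turns:
        turn_len = len(turn["value"])
        if current and size + turn_len > max_chars:
            chunks.append(current)
            current, size = [], 0
        current.append(turn)
        size += turn_len
    if current:
        chunks.append(current)
    return chunks


# Assumed split and column names; inspect the dataset first if they differ.
ds = load_dataset("grimulkan/LimaRP-augmented", split="train")
short_convs = [chunk for ex in ds for chunk in resplit(ex["conversations"])]
print(f"{len(ds)} long conversations -> {len(short_convs)} re-split chunks")
```
|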
anan-2024/twitter_dataset_1713176034 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 77344
num_examples: 194
download_size: 47892
dataset_size: 77344
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
crumb/deduped-pile-askmistral-shard1-top1-in-4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: pos
dtype: float64
splits:
- name: train
num_bytes: 5335090828.0
num_examples: 1002630
download_size: 3227201658
dataset_size: 5335090828.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
1,378,234,368 tokens (using the Llama tokenizer; ~1.18B GPT-4 tokens) from a deduped Pile raw shard, filtered to len < 896, then scored with Ask-LLM ([“How to Train Data-Efficient LLMs”](https://arxiv.org/abs/2402.09668)) using [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2), keeping the top 1/4 by score. A loading sketch follows the record example below.
```
{
"text": "Once upon a time...",
"pos": -5.654354325
}
```
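A minimal loading sketch, streaming the shard and filtering on the `pos` field from the record example above (the threshold, and the assumption that a higher `pos` means a stronger Ask-LLM keep signal, are illustrative only):
```python
from itertools import islice

from datasets import load_dataset

# Stream the shard and keep only documents above an Ask-LLM score threshold.
ds = load_dataset(
    "crumb/deduped-pile-askmistral-shard1-top1-in-4",
    split="train",
    streaming=True,
)

THRESHOLD = -3.0  # hypothetical cutoff on the Ask-LLM score
high_scoring = (ex for ex in ds if ex["pos"] > THRESHOLD)

for ex in islice(high_scoring, 3):
    print(round(ex["pos"], 3), ex["text"][:80].replace("\n", " "))
```
|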
AfonsoBiscaia/CH | ---
language:
- pt
pretty_name: ChTweets
--- |
open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca | ---
pretty_name: Evaluation run of cgato/TheSpice-7b-FT-ExperimentalOrca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cgato/TheSpice-7b-FT-ExperimentalOrca](https://huggingface.co/cgato/TheSpice-7b-FT-ExperimentalOrca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T22:01:50.628037](https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca/blob/main/results_2024-03-27T22-01-50.628037.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6316919671647624,\n\
\ \"acc_stderr\": 0.03239094387074013,\n \"acc_norm\": 0.6367741206840596,\n\
\ \"acc_norm_stderr\": 0.03304933651470867,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5486536435888252,\n\
\ \"mc2_stderr\": 0.01507950125704567\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938165,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.646584345747859,\n\
\ \"acc_stderr\": 0.00477053405584105,\n \"acc_norm\": 0.8425612427803226,\n\
\ \"acc_norm_stderr\": 0.0036346959069096605\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n\
\ \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n\
\ \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n\
\ \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n\
\ \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n\
\ \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895507,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396993,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396993\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n\
\ \"acc_stderr\": 0.016197807956848047,\n \"acc_norm\": 0.8275229357798165,\n\
\ \"acc_norm_stderr\": 0.016197807956848047\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294406994,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294406994\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917212,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917212\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40893854748603353,\n\
\ \"acc_stderr\": 0.016442830654715537,\n \"acc_norm\": 0.40893854748603353,\n\
\ \"acc_norm_stderr\": 0.016442830654715537\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.01273971155404571,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.01273971155404571\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.02448448716291397,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.02448448716291397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5486536435888252,\n\
\ \"mc2_stderr\": 0.01507950125704567\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.011268519971577684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3821076573161486,\n \
\ \"acc_stderr\": 0.013384173935648492\n }\n}\n```"
repo_url: https://huggingface.co/cgato/TheSpice-7b-FT-ExperimentalOrca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|arc:challenge|25_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|gsm8k|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hellaswag|10_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T22-01-50.628037.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T22-01-50.628037.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- '**/details_harness|winogrande|5_2024-03-27T22-01-50.628037.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T22-01-50.628037.parquet'
- config_name: results
data_files:
- split: 2024_03_27T22_01_50.628037
path:
- results_2024-03-27T22-01-50.628037.parquet
- split: latest
path:
- results_2024-03-27T22-01-50.628037.parquet
---
# Dataset Card for Evaluation run of cgato/TheSpice-7b-FT-ExperimentalOrca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cgato/TheSpice-7b-FT-ExperimentalOrca](https://huggingface.co/cgato/TheSpice-7b-FT-ExperimentalOrca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca",
"harness_winogrande_5",
    split="latest")
```
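The same pattern works for any of the configurations and splits declared in this card's metadata; for instance (a sketch reusing the config and split names listed above):
```python
from datasets import load_dataset

# Load the "latest" split of a single MMLU subtask config declared in the metadata above.
law = load_dataset(
    "open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca",
    "harness_hendrycksTest_professional_law_5",
    split="latest",
)

# Load the aggregated results used by the Open LLM Leaderboard.
results = load_dataset(
    "open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca",
    "results",
    split="latest",
)
```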
## Latest results
These are the [latest results from run 2024-03-27T22:01:50.628037](https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca/blob/main/results_2024-03-27T22-01-50.628037.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6316919671647624,
"acc_stderr": 0.03239094387074013,
"acc_norm": 0.6367741206840596,
"acc_norm_stderr": 0.03304933651470867,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5486536435888252,
"mc2_stderr": 0.01507950125704567
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938165,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.646584345747859,
"acc_stderr": 0.00477053405584105,
"acc_norm": 0.8425612427803226,
"acc_norm_stderr": 0.0036346959069096605
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895507,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396993,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396993
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848047,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848047
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406994,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406994
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917212,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917212
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40893854748603353,
"acc_stderr": 0.016442830654715537,
"acc_norm": 0.40893854748603353,
"acc_norm_stderr": 0.016442830654715537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.01273971155404571,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.01273971155404571
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.02448448716291397,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.02448448716291397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5486536435888252,
"mc2_stderr": 0.01507950125704567
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.011268519971577684
},
"harness|gsm8k|5": {
"acc": 0.3821076573161486,
"acc_stderr": 0.013384173935648492
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Sleoruiz/disc_cla_segunda-2 | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: comision
dtype: string
- name: fecha_gaceta
dtype: string
- name: gaceta_numero
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 16832400
num_examples: 7327
download_size: 8416037
dataset_size: 16832400
---
# Dataset Card for "disc_cla_segunda-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hassansh/rte_n_shot | ---
dataset_info:
features:
- name: input
dtype: string
- name: target_str
dtype: string
- name: target
dtype: int64
splits:
- name: 0_shot
num_bytes: 99434
num_examples: 277
- name: 1_shot
num_bytes: 185054
num_examples: 277
- name: 2_shot
num_bytes: 271196
num_examples: 277
- name: 3_shot
num_bytes: 359507
num_examples: 277
- name: 4_shot
num_bytes: 447008
num_examples: 277
- name: 5_shot
num_bytes: 531554
num_examples: 277
download_size: 531160
dataset_size: 1893753
configs:
- config_name: 0_shot
data_files:
- split: test
path: data/0_shot-*
- config_name: 1_shot
data_files:
- split: test
path: data/1_shot-*
- config_name: 2_shot
data_files:
- split: test
path: data/2_shot-*
- config_name: 3_shot
data_files:
- split: test
path: data/3_shot-*
- config_name: 4_shot
data_files:
- split: test
path: data/4_shot-*
- config_name: 5_shot
data_files:
- split: test
path: data/5_shot-*
---
|
anton-l/earnings22_baseline_5_gram | ---
license: apache-2.0
---
|
cardiffnlp/relentless | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- n<1K
pretty_name: relentless
---
# Dataset Card for "cardiffnlp/relentless"
***RelEntLess*** is a new benchmark, in which entity pairs have to be ranked according to how much they satisfy a given graded relation.
Essentially, the task is a ranking task in which we provide five prototypical examples for each relation. The following brief description of each relation type is used in our baseline, in addition to the prototypical examples.
Please check our paper "[A RelEntLess Benchmark for Modelling Graded Relations between Named Entities](https://arxiv.org/abs/2305.15002)" for more detail.
```python
{
"friend/ally of": "entities that are friends or allies",
"competitor/rival of": "entities that are competitors or rivals",
"known for": "examples of what entities are known for",
"influenced by": "what has influenced different entities",
"similar to": "examples of entities that are similar"
}
```
## Dataset Description
- **Repository:** [https://huggingface.co/datasets/cardiffnlp/relentless](https://huggingface.co/datasets/cardiffnlp/relentless)
- **Paper:** [A RelEntLess Benchmark for Modelling Graded Relations between Named Entities](https://arxiv.org/abs/2305.15002)
- **Dataset:** [https://huggingface.co/datasets/cardiffnlp/relentless](https://huggingface.co/datasets/cardiffnlp/relentless)
### Dataset Summary
| relation_type | val. | test |
|:--------------------|-------:|-------:|
| competitor/rival of | 20 | 84 |
| friend/ally of | 19 | 88 |
| influenced by | 19 | 90 |
| known for | 18 | 105 |
| similar to | 19 | 89 |
## Dataset Structure
### Data Instances
```python
{
"pairs": [["Le Corbusier", "purism art"], ["Sean Connery", "Finding Forrester"], ...],
"scores_all": [[4.0, 5.0, 3.0, 4.0, 5.0, 3.0, 5.0], [4.0, 5.0, 2, 5.0, 5.0, 4.0, 2], ...],
"scores_mean": [4.142857142857143, 3.857142857142857, 4.857142857142857, ...],
"relation_type": "known for",
"ranks": [8.5, 11, 5, 14, 15, 5, 20, 13, 1.5, 18, 10, 1.5, 17, ...],
"prototypical_examples": [ [ "Russell Crowe", "Gladiator" ], [ "Cadbury", "chocolate" ],...]
}
```
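To illustrate how these fields fit together, here is a minimal evaluation sketch (not from the paper): it scores each candidate pair with a placeholder function and compares the predicted scores to the averaged human judgements using Spearman's rank correlation. `score_pair` is hypothetical, and the assumption that a higher `scores_mean` means the relation holds more strongly is ours.
```python
from scipy.stats import spearmanr

def score_pair(head: str, tail: str) -> float:
    # Hypothetical scorer; replace with a model that rates how well (head, tail)
    # satisfies the relation. The heuristic below is only a placeholder.
    return float(len(head) + len(tail))

def evaluate(example: dict) -> float:
    # Score every candidate pair for this relation type.
    predicted = [score_pair(head, tail) for head, tail in example["pairs"]]
    # Compare predicted scores to the mean human scores
    # (assumed here: higher scores_mean = relation holds more strongly).
    correlation, _ = spearmanr(predicted, example["scores_mean"])
    return correlation
```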
### Citation Information
```
@misc{ushio2023relentless,
title={A RelEntLess Benchmark for Modelling Graded Relations between Named Entities},
author={Asahi Ushio and Jose Camacho Collados and Steven Schockaert},
year={2023},
eprint={2305.15002},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
distilled-from-one-sec-cv12/chunk_206 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1106726064
num_examples: 215652
download_size: 1131701393
dataset_size: 1106726064
---
# Dataset Card for "chunk_206"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeoZhangzaolin/Graptoloidea-Specimens-Imaging | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- image-classification
- object-detection
pretty_name: GraptoloideaSpecimensDataset
tags:
- graptoloidea
- paleontology
- specimens
- fossils
- biology
- earth-science
dataset_info:
features:
- name: Suborder
dtype: string
- name: Infraorder
dtype: string
- name: Family (Subfamily)
dtype: string
- name: Genus
dtype: string
- name: tagged species name
dtype: string
- name: image
dtype: string
- name: Stage
dtype: string
- name: mean age value
dtype: float64
- name: Locality (Longitude, Latitude, Horizon)
dtype: string
- name: Reference (specimens firstly published)
dtype: string
splits:
- name: train
num_bytes: 44749
num_examples: 977
- name: test
num_bytes: 22835
num_examples: 209
- name: validation
num_bytes: 22221
num_examples: 211
download_size: 87686
dataset_size: 839092
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for Graptoloidea Specimens Imaging
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Summary](#dataset-summary)
- [Dataset Preprocessing](#dataset-preprocessing)
- [Dataset Description](#dataset-description)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instance](#data-instance)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Data Processing](#data-processing)
- [Bias, Risks, and Limitations](#bias-risks-and-limitations)
- [Citation](#citation)
## Dataset Summary
This dataset offers a detailed examination of Graptoloidea specimens, featuring attributes like image file paths, suborder, infraorder, family (including subfamily), tagged species names, geological stages, mean age values, and locality details (with coordinates and horizon information), complemented by original reference citations for each specimen. It serves as a comprehensive resource for paleontological research, emphasizing morphological and stratigraphic analysis of these ancient colonial animals.
## Dataset Preprocessing
This dataset doesn't download the images locally by default. Instead, it exposes URLs to the images. To fetch the images, use the following code (make sure you are in the correct environment):
```python
from concurrent.futures import ThreadPoolExecutor
from functools import partial
import io
import urllib.request
import PIL.Image
from datasets import load_dataset
from datasets.utils.file_utils import get_datasets_user_agent
USER_AGENT = get_datasets_user_agent()
def fetch_single_image(image_url, timeout=None, retries=0):
for _ in range(retries + 1):
try:
request = urllib.request.Request(
image_url,
data=None,
headers={"user-agent": USER_AGENT},
)
with urllib.request.urlopen(request, timeout=timeout) as req:
image = PIL.Image.open(io.BytesIO(req.read()))
break
except Exception:
image = None
return image
def fetch_images(batch, num_threads, timeout=None, retries=0):
fetch_single_image_with_args = partial(fetch_single_image, timeout=timeout, retries=retries)
with ThreadPoolExecutor(max_workers=num_threads) as executor:
batch["image"] = list(executor.map(fetch_single_image_with_args, batch["image"]))
return batch
num_threads = 20
dset = load_dataset('Graptolodiea-Speciemens-Imaging.py')
dset = dset.map(fetch_images, batched=True, batch_size=100, fn_kwargs={"num_threads": num_threads})
```
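Note that `fetch_single_image` returns `None` when a download fails; as an optional follow-up step (not part of the original snippet), you could drop such rows afterwards:
```python
# Optional: remove examples whose image could not be fetched.
dset = dset.filter(lambda example: example["image"] is not None)
```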
## Dataset Description
### Supported Tasks
- **Paleontological Analysis and Classification**: Utilizing the dataset for detailed classification of Graptoloidea species, including sorting by suborder, infraorder, and family. Fundamental for researchers in paleontology to understand evolutionary trends and species distinctions.
- **Age Estimation and Stratigraphic Correlation**: Leveraging mean age values and stage data to estimate the geological age of specimens and correlate them with stratigraphic layers. Crucial for geologists and paleontologists in mapping the geological timeline and understanding the Earth's history.
- **Geographical Distribution Study**: Analyzing locality data to study the geographical distribution and migration patterns of Graptoloidea species. Can reveal insights into ancient ecological conditions and biogeographic events.
- **Morphological Analysis**: Using the provided specimen images for morphological studies, enabling the identification of unique features and variations within the Graptoloidea order. Important for taxonomic classification and evolutionary studies.
- **Data-Driven Paleobiology**: Applying machine learning and statistical methods to uncover patterns and relationships in Graptoloidea evolution, diversity, and extinction events.
- **Morphometric Analysis**: Employing image processing techniques to measure and analyze morphological features of the specimens, such as length, shape, branching patterns, and other key characteristics.
- **Virtual Reconstruction and 3D Modeling**: Using specimen images to create detailed 3D models of Graptoloidea for virtual reality experiences, aiding in both research and educational endeavors.
- **Educational and Outreach Tools**: Developing interactive tools and applications for educational purposes, using specimen images to engage and teach students and the public about Graptoloidea and paleontology.
- **Crowdsourcing and Citizen Science Projects**: Allowing citizen scientists to access and annotate the images, contributing to data collection and analysis efforts.
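As a rough illustration of the morphometric analysis task above, the sketch below thresholds a fetched specimen image and measures the bounding box of its dark pixels as a crude proxy for overall length and width. Both the threshold value and the assumption that the specimen is darker than the background are illustrative choices, not properties of the dataset:
```python
import numpy as np
import PIL.Image

def rough_specimen_extent(image: PIL.Image.Image, threshold: int = 100):
    """Return (height_px, width_px) of the bounding box of dark pixels.

    Crude morphometric proxy: assumes the specimen is darker than the
    background; the threshold is illustrative, not part of the dataset.
    """
    gray = np.asarray(image.convert("L"))
    mask = gray < threshold               # pixels assumed to belong to the specimen
    if not mask.any():
        return None
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    return int(rows[-1] - rows[0] + 1), int(cols[-1] - cols[0] + 1)

# Example usage with an image fetched as in the Dataset Preprocessing section:
# print(rough_specimen_extent(dset["train"][0]["image"]))
```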
### Language
- **English**
## Dataset Structure
### Data instance
```
{
'Suborder': 'Axonophora Frech, 1897',
'Infraorder': 'Neograptina Štorch & others, 2011',
'Family (Subfamily)': 'Dimorphograptidae Elles & Wood, 1908 (no subfamily)',
'Genus': 'Akidograptus',
'tagged species name': 'Akidograptus ascensus',
'image': 'https://raw.githubusercontent.com/LeoZhangzaolin/photos/main/14545Akidograptus_ascensus.jpg',
'Stage': 'Rhuddanian, Llandovery (early Silurian)',
'mean age value': 442.3,
'Locality (Longitude, Latitude, Horizon)': 'Huangshu Village in Anji County, Zhejiang Province (119.676, 30.608, Lower Silurian)',
'Reference (specimens firstly published)': 'Yang, D.Q. 1964. Some Lower Silurian graptolites from Anji, northwestern Zhejiang (Chekiang). Acta Palaeontologica Sinica, 12(4): 628-635.'
}
```
### Data Fields
- `Suborder` (string): Suborder of the graptoloidea specimen.
- `Infraorder` (string): Infraorder of the graptoloidea specimen.
- `Family (Subfamily)` (string): Family of the graptoloidea specimen (with subfamily where applicable).
- `Genus` (string): Genus of the graptoloidea specimen.
- `tagged species name` (string): The tagged species name for the specimen.
- `image` (string): URL of the specimen image file.
- `Stage` (string): Geological stage in which the specimen occurs.
- `mean age value` (float): Approximate mean geological age of the specimen, in millions of years.
- `Locality (Longitude, Latitude, Horizon)` (string): Locality where the specimen was found (with longitude, latitude, and horizon).
- `Reference (specimens firstly published)` (string): Reference in which the specimen was first published.
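Once loaded, these fields can be queried directly. A minimal sketch, assuming `dset` was loaded as in the Dataset Preprocessing section and that a `train` split exists (see Data Splits below):
```python
# Filter specimens by stage and average their mean age values.
rhuddanian = dset["train"].filter(lambda ex: "Rhuddanian" in ex["Stage"])
ages = rhuddanian["mean age value"]
if ages:
    print(f"{len(ages)} Rhuddanian specimens, average age ~{sum(ages) / len(ages):.1f} Ma")
```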
### Data Splits
The data is split into 70% training, 15% testing, and 15% validation sets.
## Dataset Creation
### Curation Rationale
The primary objective of curating the Graptoloidea Specimens dataset is to provide a comprehensive and accessible resource for the study and analysis of Graptoloidea, an order of extinct marine colonial organisms. This dataset is intended to support a wide range of scientific endeavors, including paleobiological research, evolutionary studies, and educational purposes in the field of paleontology. By assembling high-resolution images and detailed taxonomic information, the dataset aims to facilitate in-depth investigations into the morphology and classification of these ancient organisms, contributing to a broader understanding of Earth's geological history and biodiversity.
### Source Data
https://zenodo.org/records/6194943
### Data Processing
The specific processing steps are explained, with code, in `CSV_Processing.py`.
## Bias, Risks, and Limitations
- **Technological and Methodological Constraints**: The dataset's utility is tied to the current state of paleontological methodologies and technologies; future advancements might necessitate reevaluation.
- **External Environmental Factors**: The dataset might not fully account for the environmental conditions under which the specimens lived or were fossilized.
- **Imaging and Interpretation Limitations**: The quality and resolution of specimen images can vary, affecting morphological analyses; interpretations based on images are subject to the observer's expertise and bias.
## Citation
DOI: [10.5281/zenodo.5205215](https://doi.org/10.5281/zenodo.5205215) |
tarsomcareen/storage | ---
license: cc-by-2.0
---
|
bigscience-data/roots_indic-pa_wikisource | ---
language: pa
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-pa_wikisource
# wikisource_filtered
- Dataset uid: `wikisource_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 2.6306 % of total
- 12.7884 % of fr
- 19.8886 % of indic-bn
- 20.9966 % of indic-ta
- 2.3478 % of ar
- 4.7068 % of indic-hi
- 18.0998 % of indic-te
- 1.7155 % of es
- 19.4800 % of indic-kn
- 9.1737 % of indic-ml
- 17.1771 % of indic-mr
- 17.1870 % of indic-gu
- 70.3687 % of indic-as
- 1.0165 % of pt
- 7.8642 % of indic-pa
- 1.3501 % of vi
- 4.9411 % of indic-or
- 0.5307 % of ca
- 2.3593 % of id
- 1.5928 % of eu
### BigScience processing steps
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- remove_wiki_mojibake
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-or
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
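The filter names above describe a sequential, per-language cleaning pipeline. As a purely illustrative sketch, the functions below are hypothetical stand-ins mirroring the step labels, not the actual ROOTS implementation:
```python
# Hypothetical sketch of chaining document-level filters, mirroring the step
# names listed above; these functions are illustrative, not the ROOTS code.
def filter_remove_empty_docs(doc: str) -> bool:
    return bool(doc.strip())

def filter_small_docs_bytes_300(doc: str) -> bool:
    return len(doc.encode("utf-8")) >= 300

def dedup_document(docs):
    seen = set()
    for doc in docs:
        if doc not in seen:
            seen.add(doc)
            yield doc

def apply_pipeline(docs):
    docs = dedup_document(docs)                          # exact-duplicate removal
    docs = (d for d in docs if filter_remove_empty_docs(d))
    docs = (d for d in docs if filter_small_docs_bytes_300(d))
    return list(docs)

print(apply_pipeline(["", "short", "x" * 400, "x" * 400]))
```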
|
liuyanchen1015/MULTI_VALUE_stsb_their_they | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 6524
num_examples: 33
- name: test
num_bytes: 2601
num_examples: 13
- name: train
num_bytes: 12823
num_examples: 61
download_size: 23932
dataset_size: 21948
---
# Dataset Card for "MULTI_VALUE_stsb_their_they"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
serbog/all_job_listings_cleaned | ---
dataset_info:
features:
- name: job_id
dtype: string
- name: description
dtype: string
- name: title
dtype: string
- name: creationdate
dtype: int64
- name: name
dtype: string
- name: location_codes
sequence: string
splits:
- name: train
num_bytes: 2991442359
num_examples: 1687761
download_size: 1434224117
dataset_size: 2991442359
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "all_job_listings_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
onyou611/ko-nms-data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7963
num_examples: 23
dataset_size: 7963
license: apache-2.0
task_categories:
- text-generation
language:
- ko
--- |
liuyanchen1015/MULTI_VALUE_sst2_invariant_tag_non_concord | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 229
num_examples: 2
- name: test
num_bytes: 468
num_examples: 4
- name: train
num_bytes: 8908
num_examples: 88
download_size: 11875
dataset_size: 9605
---
# Dataset Card for "MULTI_VALUE_sst2_invariant_tag_non_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maxolotl/must-c-en-es-wait3-02 | ---
dataset_info:
features:
- name: current_source
dtype: string
- name: current_target
dtype: string
- name: target_token
dtype: string
splits:
- name: train
num_bytes: 995120593
num_examples: 5240243
- name: test
num_bytes: 9960448
num_examples: 57187
- name: validation
num_bytes: 5429701
num_examples: 27549
download_size: 184348036
dataset_size: 1010510742
---
# Dataset Card for "must-c-en-es-wait3-02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PerceptionEval/Depth | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: question
dtype: string
- name: image_1
dtype: image
- name: choices
sequence: string
- name: answer
dtype: string
- name: prompt
dtype: string
splits:
- name: val
num_bytes: 11207430.0
num_examples: 124
- name: test
num_bytes: 11111927.0
num_examples: 124
download_size: 22271776
dataset_size: 22319357.0
configs:
- config_name: default
data_files:
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
CyberHarem/shiroko_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shiroko/砂狼シロコ/白子 (Blue Archive)
This is the dataset of shiroko/砂狼シロコ/白子 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `animal_ears, grey_hair, wolf_ears, animal_ear_fluff, blue_eyes, halo, hair_ornament, cross_hair_ornament, mismatched_pupils, extra_ears, medium_hair, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.25 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shiroko_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 1014.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiroko_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1392 | 2.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/shiroko_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiroko_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 36 |  |  |  |  |  | 1girl, official_alternate_costume, short_sleeves, solo, looking_at_viewer, bike_shorts, green_gloves, blush, sweat, outdoors, bicycle, ass, water_bottle, holding_bottle, blue_sky, parted_lips |
| 1 | 5 |  |  |  |  |  | 1girl, assault_rifle, black_skirt, blue_scarf, green_gloves, long_sleeves, looking_at_viewer, open_jacket, outdoors, plaid_skirt, pleated_skirt, school_uniform, solo, white_shirt, white_socks, bag, black_footwear, black_jacket, blazer, blue_jacket, closed_mouth, cloud, full_body, kneehighs, sig_sauer, blue_sky, blush, day, necktie, bicycle, hair_between_eyes, holding_gun, sneakers, standing |
| 2 | 7 |  |  |  |  |  | 1girl, blazer, blue_scarf, blue_sky, cloud, day, long_sleeves, looking_at_viewer, open_jacket, outdoors, pleated_skirt, school_uniform, solo, blue_jacket, blue_necktie, hair_between_eyes, parted_lips, plaid_skirt, white_shirt, black_skirt, green_gloves, school_bag, blush, building, shirt_tucked_in, shoulder_bag |
| 3 | 6 |  |  |  |  |  | 1girl, assault_rifle, blue_scarf, closed_mouth, long_sleeves, looking_at_viewer, open_jacket, plaid_skirt, pleated_skirt, school_uniform, sig_sauer, solo, white_shirt, black_skirt, blazer, blue_jacket, green_gloves, outdoors, single_glove, bag, blue_necktie, id_card, shirt_tucked_in, sky |
| 4 | 7 |  |  |  |  |  | 1girl, black_skirt, blue_scarf, closed_mouth, long_sleeves, looking_at_viewer, open_jacket, plaid_skirt, pleated_skirt, school_bag, school_uniform, shoulder_bag, solo, white_shirt, blue_jacket, blue_necktie, green_gloves, simple_background, white_background, blazer, blush, cowboy_shot, single_glove, hair_between_eyes, shirt_tucked_in |
| 5 | 12 |  |  |  |  |  | 1girl, black_skirt, blue_scarf, long_sleeves, looking_at_viewer, open_jacket, plaid_skirt, pleated_skirt, school_uniform, solo, white_shirt, assault_rifle, green_gloves, sig_sauer, white_background, simple_background, blazer, blue_jacket, closed_mouth, single_glove, blue_necktie, holding_gun, white_socks, black_footwear, blush, hair_between_eyes, kneehighs, school_bag, sneakers |
| 6 | 7 |  |  |  |  |  | 1girl, blazer, blue_necktie, blue_scarf, looking_at_viewer, open_jacket, school_uniform, solo, upper_body, white_shirt, black_jacket, long_sleeves, simple_background, white_background, blush, closed_mouth, hair_between_eyes, parted_lips |
| 7 | 18 |  |  |  |  |  | 1girl, competition_swimsuit, covered_navel, looking_at_viewer, official_alternate_costume, outdoors, solo, black_one-piece_swimsuit, blue_sky, cloud, day, cowboy_shot, multicolored_swimsuit, highleg_swimsuit, low_ponytail, water, wet, standing, armpits, thighs, arms_up, blush, ocean, arms_behind_head, earrings, wading |
| 8 | 7 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, competition_swimsuit, cowboy_shot, looking_at_viewer, official_alternate_costume, simple_background, solo, white_background, covered_navel, highleg_swimsuit, multicolored_swimsuit, bag, low_ponytail, thighs, bare_arms, bare_shoulders |
| 9 | 46 |  |  |  |  |  | 1girl, solo, looking_at_viewer, large_breasts, black_choker, cleavage, hair_between_eyes, black_dress, collarbone, ahoge, long_sleeves, thigh_strap, very_long_hair, black_gloves, parted_lips, simple_background, alternate_costume |
| 10 | 7 |  |  |  |  |  | blush, enmaided, looking_at_viewer, maid_apron, white_apron, 1girl, black_dress, closed_mouth, frilled_apron, frilled_dress, maid_headdress, puffy_short_sleeves, solo, simple_background, waist_apron, white_background, black_footwear, blue_bowtie, cleavage, holding, indoors, standing, white_thighhighs, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | official_alternate_costume | short_sleeves | solo | looking_at_viewer | bike_shorts | green_gloves | blush | sweat | outdoors | bicycle | ass | water_bottle | holding_bottle | blue_sky | parted_lips | assault_rifle | black_skirt | blue_scarf | long_sleeves | open_jacket | plaid_skirt | pleated_skirt | school_uniform | white_shirt | white_socks | bag | black_footwear | black_jacket | blazer | blue_jacket | closed_mouth | cloud | full_body | kneehighs | sig_sauer | day | necktie | hair_between_eyes | holding_gun | sneakers | standing | blue_necktie | school_bag | building | shirt_tucked_in | shoulder_bag | single_glove | id_card | sky | simple_background | white_background | cowboy_shot | upper_body | competition_swimsuit | covered_navel | black_one-piece_swimsuit | multicolored_swimsuit | highleg_swimsuit | low_ponytail | water | wet | armpits | thighs | arms_up | ocean | arms_behind_head | earrings | wading | bare_arms | bare_shoulders | large_breasts | black_choker | cleavage | black_dress | collarbone | ahoge | thigh_strap | very_long_hair | black_gloves | alternate_costume | enmaided | maid_apron | white_apron | frilled_apron | frilled_dress | maid_headdress | puffy_short_sleeves | waist_apron | blue_bowtie | holding | indoors | white_thighhighs | wrist_cuffs |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------------------|:----------------|:-------|:--------------------|:--------------|:---------------|:--------|:--------|:-----------|:----------|:------|:---------------|:-----------------|:-----------|:--------------|:----------------|:--------------|:-------------|:---------------|:--------------|:--------------|:----------------|:-----------------|:--------------|:--------------|:------|:-----------------|:---------------|:---------|:--------------|:---------------|:--------|:------------|:------------|:------------|:------|:----------|:--------------------|:--------------|:-----------|:-----------|:---------------|:-------------|:-----------|:------------------|:---------------|:---------------|:----------|:------|:--------------------|:-------------------|:--------------|:-------------|:-----------------------|:----------------|:---------------------------|:------------------------|:-------------------|:---------------|:--------|:------|:----------|:---------|:----------|:--------|:-------------------|:-----------|:---------|:------------|:-----------------|:----------------|:---------------|:-----------|:--------------|:-------------|:--------|:--------------|:-----------------|:---------------|:--------------------|:-----------|:-------------|:--------------|:----------------|:----------------|:-----------------|:----------------------|:--------------|:--------------|:----------|:----------|:-------------------|:--------------|
| 0 | 36 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | X | | X | X | | X | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | X | | X | X | | X | | | | | X | X | | X | X | X | X | X | X | X | X | | | | | X | X | | X | | | | X | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | X | X | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | | X | | | X | X | X | | | | X | | | | | | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | X | X | X | | | | | | | X | | | | X | X | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | X | X | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | X | | X | X | X | | | X | X | | | X | X | X | | X | X | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | X | X | | | X | | | | | | | | X | | | X | X | X | | | X | X | | | | X | X | | X | | | | | | | X | | | | X | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 18 |  |  |  |  |  | X | X | | X | X | | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | X | X | X | X | X | X | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 46 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_decruz07__llama-2-7b-miniguanaco | ---
pretty_name: Evaluation run of decruz07/llama-2-7b-miniguanaco
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [decruz07/llama-2-7b-miniguanaco](https://huggingface.co/decruz07/llama-2-7b-miniguanaco)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decruz07__llama-2-7b-miniguanaco\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T16:23:25.560074](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__llama-2-7b-miniguanaco/blob/main/results_2024-01-10T16-23-25.560074.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4622895077291068,\n\
\ \"acc_stderr\": 0.03447917370505138,\n \"acc_norm\": 0.4668682561630037,\n\
\ \"acc_norm_stderr\": 0.03525354072650985,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.43733395896519406,\n\
\ \"mc2_stderr\": 0.01449344801677889\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4522184300341297,\n \"acc_stderr\": 0.014544519880633832,\n\
\ \"acc_norm\": 0.4906143344709898,\n \"acc_norm_stderr\": 0.014608816322065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5611431985660227,\n\
\ \"acc_stderr\": 0.004952332378120329,\n \"acc_norm\": 0.7559251145190201,\n\
\ \"acc_norm_stderr\": 0.004286594977390901\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5132075471698113,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.5132075471698113,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4930555555555556,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.4930555555555556,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261128,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261128\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5225806451612903,\n\
\ \"acc_stderr\": 0.02841498501970786,\n \"acc_norm\": 0.5225806451612903,\n\
\ \"acc_norm_stderr\": 0.02841498501970786\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5636363636363636,\n \"acc_stderr\": 0.03872592983524754,\n\
\ \"acc_norm\": 0.5636363636363636,\n \"acc_norm_stderr\": 0.03872592983524754\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5505050505050505,\n \"acc_stderr\": 0.035441324919479704,\n \"\
acc_norm\": 0.5505050505050505,\n \"acc_norm_stderr\": 0.035441324919479704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38974358974358975,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.38974358974358975,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514568,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514568\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3907563025210084,\n \"acc_stderr\": 0.03169380235712997,\n \
\ \"acc_norm\": 0.3907563025210084,\n \"acc_norm_stderr\": 0.03169380235712997\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6495412844036698,\n \"acc_stderr\": 0.02045607759982446,\n \"\
acc_norm\": 0.6495412844036698,\n \"acc_norm_stderr\": 0.02045607759982446\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3194444444444444,\n \"acc_stderr\": 0.03179876342176851,\n \"\
acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.03179876342176851\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.03465868196380762,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.03465868196380762\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.03190080389473235,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.03190080389473235\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n\
\ \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n\
\ \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907061,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907061\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6270753512132823,\n\
\ \"acc_stderr\": 0.01729286826945392,\n \"acc_norm\": 0.6270753512132823,\n\
\ \"acc_norm_stderr\": 0.01729286826945392\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.026917296179149123,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.026917296179149123\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.02776768960683393,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.02776768960683393\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32978723404255317,\n \"acc_stderr\": 0.0280459469420424,\n \
\ \"acc_norm\": 0.32978723404255317,\n \"acc_norm_stderr\": 0.0280459469420424\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33833116036505867,\n\
\ \"acc_stderr\": 0.012084265626344202,\n \"acc_norm\": 0.33833116036505867,\n\
\ \"acc_norm_stderr\": 0.012084265626344202\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003483,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003483\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4395424836601307,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.4395424836601307,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.44081632653061226,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.44081632653061226,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.035123109641239346,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.035123109641239346\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.43733395896519406,\n\
\ \"mc2_stderr\": 0.01449344801677889\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.01253329273262029\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16148597422289612,\n \
\ \"acc_stderr\": 0.01013595945213431\n }\n}\n```"
repo_url: https://huggingface.co/decruz07/llama-2-7b-miniguanaco
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-19-24.687449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-23-25.560074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T16-23-25.560074.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- '**/details_harness|winogrande|5_2024-01-10T16-19-24.687449.parquet'
- split: 2024_01_10T16_23_25.560074
path:
- '**/details_harness|winogrande|5_2024-01-10T16-23-25.560074.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T16-23-25.560074.parquet'
- config_name: results
data_files:
- split: 2024_01_10T16_19_24.687449
path:
- results_2024-01-10T16-19-24.687449.parquet
- split: 2024_01_10T16_23_25.560074
path:
- results_2024-01-10T16-23-25.560074.parquet
- split: latest
path:
- results_2024-01-10T16-23-25.560074.parquet
---
# Dataset Card for Evaluation run of decruz07/llama-2-7b-miniguanaco
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [decruz07/llama-2-7b-miniguanaco](https://huggingface.co/decruz07/llama-2-7b-miniguanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_decruz07__llama-2-7b-miniguanaco",
"harness_winogrande_5",
split="train")
```
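The same pattern works for any of the configurations listed in the YAML header above. As a hedged sketch (the config and split names below are copied from that header; the exact record fields are not guaranteed), you could load the aggregated results and a single MMLU subject like this:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_decruz07__llama-2-7b-miniguanaco"

# Aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(REPO, "results", split="latest")

# Per-task details: any config from the header works, e.g. one MMLU subject.
anatomy_details = load_dataset(REPO, "harness_hendrycksTest_anatomy_5", split="latest")

print(results[0].keys())     # columns of the aggregated results table
print(len(anatomy_details))  # number of evaluated examples for this task
```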
## Latest results
These are the [latest results from run 2024-01-10T16:23:25.560074](https://huggingface.co/datasets/open-llm-leaderboard/details_decruz07__llama-2-7b-miniguanaco/blob/main/results_2024-01-10T16-23-25.560074.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4622895077291068,
"acc_stderr": 0.03447917370505138,
"acc_norm": 0.4668682561630037,
"acc_norm_stderr": 0.03525354072650985,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.43733395896519406,
"mc2_stderr": 0.01449344801677889
},
"harness|arc:challenge|25": {
"acc": 0.4522184300341297,
"acc_stderr": 0.014544519880633832,
"acc_norm": 0.4906143344709898,
"acc_norm_stderr": 0.014608816322065
},
"harness|hellaswag|10": {
"acc": 0.5611431985660227,
"acc_stderr": 0.004952332378120329,
"acc_norm": 0.7559251145190201,
"acc_norm_stderr": 0.004286594977390901
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5132075471698113,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.5132075471698113,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4930555555555556,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.4930555555555556,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261128,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261128
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5225806451612903,
"acc_stderr": 0.02841498501970786,
"acc_norm": 0.5225806451612903,
"acc_norm_stderr": 0.02841498501970786
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.03872592983524754,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.03872592983524754
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5505050505050505,
"acc_stderr": 0.035441324919479704,
"acc_norm": 0.5505050505050505,
"acc_norm_stderr": 0.035441324919479704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38974358974358975,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.38974358974358975,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514568,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514568
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3907563025210084,
"acc_stderr": 0.03169380235712997,
"acc_norm": 0.3907563025210084,
"acc_norm_stderr": 0.03169380235712997
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6495412844036698,
"acc_stderr": 0.02045607759982446,
"acc_norm": 0.6495412844036698,
"acc_norm_stderr": 0.02045607759982446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.03179876342176851,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.03179876342176851
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.03465868196380762,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.03465868196380762
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.03190080389473235,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.03190080389473235
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907061,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907061
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6270753512132823,
"acc_stderr": 0.01729286826945392,
"acc_norm": 0.6270753512132823,
"acc_norm_stderr": 0.01729286826945392
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.026917296179149123,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.026917296179149123
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961459,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.02776768960683393,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.02776768960683393
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32978723404255317,
"acc_stderr": 0.0280459469420424,
"acc_norm": 0.32978723404255317,
"acc_norm_stderr": 0.0280459469420424
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33833116036505867,
"acc_stderr": 0.012084265626344202,
"acc_norm": 0.33833116036505867,
"acc_norm_stderr": 0.012084265626344202
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4395424836601307,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.4395424836601307,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.44081632653061226,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.44081632653061226,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.035123109641239346,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.035123109641239346
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.43733395896519406,
"mc2_stderr": 0.01449344801677889
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.01253329273262029
},
"harness|gsm8k|5": {
"acc": 0.16148597422289612,
"acc_stderr": 0.01013595945213431
}
}
```
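As a minimal, hedged sketch of how these per-subject numbers can be aggregated (only two subjects from the dictionary above are inlined so the snippet runs standalone; in practice you would parse the full JSON file linked above, where these metrics may sit under a top-level `"results"` key):
```python
# Hedged sketch: aggregate per-subject MMLU ("hendrycksTest") accuracies.
# Only two subjects are inlined here so the example is self-contained; the
# values are taken from the results dictionary shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45185185185185184},
}

mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subjects, mean acc = {mean_acc:.4f}")
```
The leaderboard average is computed over all 57 MMLU subjects, so the two-subject mean here is purely illustrative.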
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Mindshift/ascen | ---
license: openrail
---
|
Aj901842/Emersonxx | ---
license: openrail
---
|
Solshine/Native_American_Treaty_Table_Formatted_Autotrain | ---
license: mit
---
Native American Treaty Table from Wikipedia, 2023, cleaned and joined. |
Codec-SUPERB/maestro_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 35536775
num_examples: 185
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 35536775
num_examples: 185
- name: academicodec_hifi_24k_320d
num_bytes: 53296775
num_examples: 185
- name: audiodec_24k_320d
num_bytes: 113683735
num_examples: 185
- name: dac_16k
num_bytes: 217405095
num_examples: 185
- name: dac_24k
num_bytes: 603593335
num_examples: 185
- name: dac_44k
num_bytes: 178401915
num_examples: 185
- name: encodec_24k
num_bytes: 26658255
num_examples: 185
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 284244855
num_examples: 185
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 284244855
num_examples: 185
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 284244855
num_examples: 185
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 142164855
num_examples: 185
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 284244855
num_examples: 185
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 284244855
num_examples: 185
- name: speech_tokenizer_16k
num_bytes: 71071575
num_examples: 185
download_size: 455379990
dataset_size: 2898573365
---
# Dataset Card for "maestro_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/biology_dataset_standardized_cluster_2 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Swisslex__Mixtral-Orca-v0.1 | ---
pretty_name: Evaluation run of Swisslex/Mixtral-Orca-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Swisslex/Mixtral-Orca-v0.1](https://huggingface.co/Swisslex/Mixtral-Orca-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Swisslex__Mixtral-Orca-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T21:13:17.458135](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-Orca-v0.1/blob/main/results_2024-01-15T21-13-17.458135.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6595098527970858,\n\
\ \"acc_stderr\": 0.03195828654224696,\n \"acc_norm\": 0.6650394576540025,\n\
\ \"acc_norm_stderr\": 0.0326025270767851,\n \"mc1\": 0.4602203182374541,\n\
\ \"mc1_stderr\": 0.017448017223960877,\n \"mc2\": 0.6385028598239161,\n\
\ \"mc2_stderr\": 0.01556463307062193\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6774744027303754,\n \"acc_stderr\": 0.013659980894277371,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7167894841665007,\n\
\ \"acc_stderr\": 0.0044963697421321076,\n \"acc_norm\": 0.8887671778530173,\n\
\ \"acc_norm_stderr\": 0.003137776444277206\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5263157894736842,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.5263157894736842,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941183,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941183\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063435,\n \
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063435\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.015014462497168585,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.015014462497168585\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671632,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671632\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494043,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494043\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.02126271940040698,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.02126271940040698\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8492975734355045,\n\
\ \"acc_stderr\": 0.012793420883120807,\n \"acc_norm\": 0.8492975734355045,\n\
\ \"acc_norm_stderr\": 0.012793420883120807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4793296089385475,\n\
\ \"acc_stderr\": 0.016708205559996137,\n \"acc_norm\": 0.4793296089385475,\n\
\ \"acc_norm_stderr\": 0.016708205559996137\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087877,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087877\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n\
\ \"acc_stderr\": 0.012769704263117526,\n \"acc_norm\": 0.4954367666232073,\n\
\ \"acc_norm_stderr\": 0.012769704263117526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7610294117647058,\n \"acc_stderr\": 0.02590528064489301,\n\
\ \"acc_norm\": 0.7610294117647058,\n \"acc_norm_stderr\": 0.02590528064489301\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.026640582539133203,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.026640582539133203\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n\
\ \"mc1_stderr\": 0.017448017223960877,\n \"mc2\": 0.6385028598239161,\n\
\ \"mc2_stderr\": 0.01556463307062193\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019806\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3730098559514784,\n \
\ \"acc_stderr\": 0.013320876609777215\n }\n}\n```"
repo_url: https://huggingface.co/Swisslex/Mixtral-Orca-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|arc:challenge|25_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|gsm8k|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hellaswag|10_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T21-13-17.458135.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T21-13-17.458135.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- '**/details_harness|winogrande|5_2024-01-15T21-13-17.458135.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T21-13-17.458135.parquet'
- config_name: results
data_files:
- split: 2024_01_15T21_13_17.458135
path:
- results_2024-01-15T21-13-17.458135.parquet
- split: latest
path:
- results_2024-01-15T21-13-17.458135.parquet
---
# Dataset Card for Evaluation run of Swisslex/Mixtral-Orca-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Swisslex/Mixtral-Orca-v0.1](https://huggingface.co/Swisslex/Mixtral-Orca-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Swisslex__Mixtral-Orca-v0.1",
"harness_winogrande_5",
split="train")
```
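Similarly, the aggregated metrics stored in the "results" configuration can be loaded by pointing at that configuration and the "latest" split (a minimal sketch; the configuration and split names are taken from the YAML header above):
```python
from datasets import load_dataset

# Load the aggregated results of the most recent run;
# the "latest" split always points at the newest timestamped results file.
results = load_dataset(
    "open-llm-leaderboard/details_Swisslex__Mixtral-Orca-v0.1",
    "results",
    split="latest",
)
print(results[0])
```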
## Latest results
These are the [latest results from run 2024-01-15T21:13:17.458135](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-Orca-v0.1/blob/main/results_2024-01-15T21-13-17.458135.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6595098527970858,
"acc_stderr": 0.03195828654224696,
"acc_norm": 0.6650394576540025,
"acc_norm_stderr": 0.0326025270767851,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960877,
"mc2": 0.6385028598239161,
"mc2_stderr": 0.01556463307062193
},
"harness|arc:challenge|25": {
"acc": 0.6774744027303754,
"acc_stderr": 0.013659980894277371,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.7167894841665007,
"acc_stderr": 0.0044963697421321076,
"acc_norm": 0.8887671778530173,
"acc_norm_stderr": 0.003137776444277206
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063435,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063435
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.015014462497168585,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.015014462497168585
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671632,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671632
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494043,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494043
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040698,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040698
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8492975734355045,
"acc_stderr": 0.012793420883120807,
"acc_norm": 0.8492975734355045,
"acc_norm_stderr": 0.012793420883120807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4793296089385475,
"acc_stderr": 0.016708205559996137,
"acc_norm": 0.4793296089385475,
"acc_norm_stderr": 0.016708205559996137
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087877,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087877
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117526,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7610294117647058,
"acc_stderr": 0.02590528064489301,
"acc_norm": 0.7610294117647058,
"acc_norm_stderr": 0.02590528064489301
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065677,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.026640582539133203,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.026640582539133203
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.017448017223960877,
"mc2": 0.6385028598239161,
"mc2_stderr": 0.01556463307062193
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019806
},
"harness|gsm8k|5": {
"acc": 0.3730098559514784,
"acc_stderr": 0.013320876609777215
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_248 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 926120720
num_examples: 180460
download_size: 943547158
dataset_size: 926120720
---
# Dataset Card for "chunk_248"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mokello/samelin | ---
license: afl-3.0
---
|
shossain/merged-no-pad-text-65536 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 372419841
num_examples: 1478
download_size: 179446670
dataset_size: 372419841
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "merged-no-pad-text-65536"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
InstaDeepAI/ms_ninespecies_benchmark | ---
license: cc0-1.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sequence
dtype: string
- name: modified_sequence
dtype: string
- name: precursor_mz
dtype: float64
- name: precursor_charge
dtype: int64
- name: mz_array
sequence: float64
- name: intensity_array
sequence: float32
splits:
- name: train
num_bytes: 839098224
num_examples: 499402
- name: validation
num_bytes: 49792990
num_examples: 28572
- name: test
num_bytes: 45505134
num_examples: 27142
download_size: 1119691599
dataset_size: 934396348
---
# Dataset Card for Nine-Species excluding Yeast
Dataset used for the baseline comparison of InstaNovo to other models.
## Dataset Description
- **Repository:** [InstaNovo](https://github.com/instadeepai/InstaNovo)
- **Paper:** [De novo peptide sequencing with InstaNovo: Accurate, database-free peptide identification for large scale proteomics experiments](https://www.biorxiv.org/content/10.1101/2023.08.30.555055v1)
### Dataset Summary
Dataset used in the original [DeepNovo](https://www.pnas.org/doi/full/10.1073/pnas.1705691114) paper.
- The training set contains 8 species excluding yeast
- The validation/test set contains the yeast species
## Dataset Structure
The dataset is tabular, where each row corresponds to a labelled MS2 spectrum; a short loading example follows the field descriptions below.
- `sequence (string)` \
The target peptide sequence excluding post-translational modifications
- `modified_sequence (string)` \
The target peptide sequence including post-translational modifications
- `precursor_mz (float64)` \
The mass-to-charge of the precursor (from MS1)
- `precursor_charge (int64)` \
The charge of the precursor (from MS1)
- `mz_array (list[float64])` \
The mass-to-charge values of the MS2 spectrum
- `intensity_array (list[float32])` \
The intensity values of the MS2 spectrum
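For illustration, a split can be loaded with the `datasets` library and each row inspected as a plain dictionary (a minimal sketch; the split and field names follow the schema above):
```python
from datasets import load_dataset

# Load the yeast validation split and inspect one labelled spectrum.
ds = load_dataset("InstaDeepAI/ms_ninespecies_benchmark", split="validation")

example = ds[0]
print(example["modified_sequence"])                    # target peptide with modifications
print(example["precursor_mz"], example["precursor_charge"])
print(len(example["mz_array"]), len(example["intensity_array"]))  # number of MS2 peaks
```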
## Citation Information
If you use this dataset, please cite the original authors.
The original data is available on [MASSIVE](https://massive.ucsd.edu/ProteoSAFe/static/massive.jsp) with the identifier `MSV000081382`.
Please also cite InstaNovo:
```bibtex
@article{eloff_kalogeropoulos_2023_instanovo,
title = {De novo peptide sequencing with InstaNovo: Accurate, database-free peptide identification for large scale proteomics experiments},
author = {Kevin Eloff and Konstantinos Kalogeropoulos and Oliver Morell and Amandla Mabona and Jakob Berg Jespersen and Wesley Williams and Sam van Beljouw and Marcin Skwark and Andreas Hougaard Laustsen and Stan J. J. Brouns and Anne Ljungars and Erwin Marten Schoof and Jeroen Van Goey and Ulrich auf dem Keller and Karim Beguir and Nicolas Lopez Carranza and Timothy Patrick Jenkins},
year = {2023},
doi = {10.1101/2023.08.30.555055},
publisher = {Cold Spring Harbor Laboratory},
URL = {https://www.biorxiv.org/content/10.1101/2023.08.30.555055v1},
journal = {bioRxiv}
}
``` |
DBQ/Gucci.Product.prices.United.States | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 2746363
num_examples: 5764
download_size: 807752
dataset_size: 2746363
---
# Gucci web scraped data
## About the website
The **luxury fashion industry** in the **United States** is a rapidly evolving market with a significant presence of globally renowned brands such as **Gucci**. In recent years, the industry has channelled much of its effort towards online platforms, driving a marked increase in **ecommerce** activity and making high-end fashion products more accessible and convenient to purchase. Of particular interest is the **Ecommerce product-list page (PLP) data on Gucci** in the United States: this dataset provides insights into online fashion consumer behavior, purchase patterns, and product preference trends.
## Link to **dataset**
[United States - Gucci - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Gucci%20Product-prices%20United%20States/r/rec9JLoIyDa81dIck)
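As a quick illustration of how the price columns might be used, the sketch below loads the train split and summarises discounting (an assumption-laden example: the column names come from the metadata above, and `flg_discount` is treated as a 0/1 flag):
```python
from datasets import load_dataset

# Load the scraped product-list data and compute a simple discount summary.
ds = load_dataset("DBQ/Gucci.Product.prices.United.States", split="train")

# Rows where the discount flag is set (assumed to mean "item is discounted").
discounted = [row for row in ds if row["flg_discount"] == 1]

share = len(discounted) / len(ds)
avg_markdown = sum(
    row["full_price"] - row["price"] for row in discounted
) / max(len(discounted), 1)

print(f"Discounted items: {share:.1%}, average markdown: {avg_markdown:.2f}")
```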
|
BelleGroup/school_math_0.25M | ---
license: gpl-3.0
task_categories:
- text2text-generation
language:
- zh
size_categories:
- 100K<n<1M
---
# School Math 0.25M
## 内容
包含约25万条由[BELLE](https://github.com/LianjiaTech/BELLE)项目生成的中文数学题数据,包含解题过程。
注意:此数据集是由ChatGPT产生的,未经过严格校验,题目或解题过程可能包含错误。使用过程中请注意这一点。
## 样例
```
{
"instruction": "题目:小华手里有一个装满糖果的袋子,共有12个,小明想知道里面有多少颗糖果,于是他问小华:“你手里的糖果袋子里有偶数个糖果吗?”小华回答:“有,而且多于10颗。”请问小华手里的糖果袋子里最少有几颗糖果?",
"input": "",
"output": "\n由题目可知:小华手里的糖果袋子里有偶数个糖果;\n又知道小华手里的糖果袋子里有多于10颗糖果。\n因为糖果数为偶数,多于10颗,所以糖果数最小必须是12。\n所以小华手里的糖果袋子里最少有12颗糖果。"
}
```
### 字段
```
instruction: 指令
input: 输入(本数据集均为空)
output: 输出
```
## 局限性和使用限制
我们要求开发者仅将我们开源的代码、数据、模型及后续衍生物用于研究目的,不得用于商业,以及其他会对社会带来危害的用途。
由于数据是由*ChatGPT*生成的,未经严格验证,在事实性和其他方面还存在一些不足。因此,在使用此数据集时,请务必注意甄别。
本数据集不代表任何一方的立场、利益或想法,无关任何团体的任何类型的主张。因使用本数据集带来的任何损害、纠纷,本项目的开发者不承担任何责任。
***
# School Math 0.25M
## Contents
Includes approx. 250k Chinese school math quizzes and answers generated by BELLE, with steps to solve the quizzes.
Note: this dataset was generated by *ChatGPT* and was not strictly verified. The quizzes or answers might contain errors. Please keep this in mind when using this dataset.
## Sample
```
{
"instruction": "题目:小华手里有一个装满糖果的袋子,共有12个,小明想知道里面有多少颗糖果,于是他问小华:“你手里的糖果袋子里有偶数个糖果吗?”小华回答:“有,而且多于10颗。”请问小华手里的糖果袋子里最少有几颗糖果?",
"input": "",
"output": "\n由题目可知:小华手里的糖果袋子里有偶数个糖果;\n又知道小华手里的糖果袋子里有多于10颗糖果。\n因为糖果数为偶数,多于10颗,所以糖果数最小必须是12。\n所以小华手里的糖果袋子里最少有12颗糖果。"
}
```
### Schema
```
instruction: the instruction
input: the input (empty for every example in this dataset)
output: the output
```
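For example, the data can be loaded and inspected as follows (a minimal sketch; it assumes the repository loads directly with the `datasets` library):
```python
from datasets import load_dataset

# Load the quizzes and look at one instruction/output pair.
ds = load_dataset("BelleGroup/school_math_0.25M", split="train")

example = ds[0]
print(example["instruction"])  # the math problem
print(example["output"])       # the step-by-step solution
```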
## Limitation and Usage Limits
We require that developers use the open-sourced code, data, models and any other artifacts generated via this project for research purposes only. Commercial use and other potentially harmful use cases are not allowed.
Since this dataset was generated by *ChatGPT* and was not strictly verified, it still has shortcomings regarding factuality and other aspects. Careful inspection is needed when using this dataset.
This dataset does not represent anyone's position, interest or opinion, and is not related to any kind of claim by any group. The developers of this project do not assume any responsibility for potential harm caused by using this dataset or project. |
gaizerick/kai | ---
license: openrail
---
|
ap0009/decomp_dataset | ---
license: apache-2.0
---
|